Oct 02 11:18:37 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 11:18:37 crc restorecon[4657]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:37 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:18:38 crc restorecon[4657]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 
11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:18:38 crc 
restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc 
restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 11:18:38 crc restorecon[4657]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 02 11:18:39 crc kubenswrapper[4658]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.631086 4658 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640606 4658 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640639 4658 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640648 4658 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640659 4658 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640668 4658 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640677 4658 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640685 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640693 4658 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640701 4658 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640708 4658 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640716 4658 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640724 4658 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640732 4658 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640740 4658 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640761 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640770 4658 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640777 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640785 4658 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640792 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640800 4658 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640808 4658 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640816 4658 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640824 4658 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640831 4658 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640838 4658 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640846 4658 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640857 4658 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640868 4658 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640878 4658 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640889 4658 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640899 4658 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640907 4658 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640916 4658 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640924 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640934 4658 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640944 4658 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640954 4658 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640961 4658 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640969 4658 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640982 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.640990 4658 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641001 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641009 4658 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641017 4658 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641025 4658 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641032 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641039 4658 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641048 4658 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641055 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641062 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641070 4658 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641077 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641085 4658 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641093 4658 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641101 4658 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641108 4658 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641115 4658 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641123 4658 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641131 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641138 4658 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641145 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641153 4658 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641161 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641168 4658 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641175 4658 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641183 4658 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641193 4658 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641202 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641211 4658 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641219 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.641226 4658 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642374 4658 flags.go:64] FLAG: --address="0.0.0.0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642395 4658 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642410 4658 flags.go:64] FLAG: --anonymous-auth="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642421 4658 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642435 4658 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642444 4658 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642456 4658 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642467 4658 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642476 4658 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642486 4658 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642495 4658 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642504 4658 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642513 4658 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642522 4658 flags.go:64] FLAG: --cgroup-root=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642531 4658 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642540 4658 flags.go:64] FLAG: --client-ca-file=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642548 4658 flags.go:64] FLAG: --cloud-config=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642556 4658 flags.go:64] FLAG: --cloud-provider=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642565 4658 flags.go:64] FLAG: --cluster-dns="[]"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642576 4658 flags.go:64] FLAG: --cluster-domain=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642585 4658 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642594 4658 flags.go:64] FLAG: --config-dir=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642602 4658 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642612 4658 flags.go:64] FLAG: --container-log-max-files="5"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642623 4658 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642632 4658 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642641 4658 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642651 4658 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642660 4658 flags.go:64] FLAG: --contention-profiling="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642671 4658 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642680 4658 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642689 4658 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642698 4658 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642709 4658 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642718 4658 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642727 4658 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642735 4658 flags.go:64] FLAG: --enable-load-reader="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642744 4658 flags.go:64] FLAG: --enable-server="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642753 4658 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642764 4658 flags.go:64] FLAG: --event-burst="100"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642774 4658 flags.go:64] FLAG: --event-qps="50"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642783 4658 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642792 4658 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642801 4658 flags.go:64] FLAG: --eviction-hard=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642812 4658 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642820 4658 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642829 4658 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642838 4658 flags.go:64] FLAG: --eviction-soft=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642847 4658 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642855 4658 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642864 4658 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642873 4658 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642881 4658 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642890 4658 flags.go:64] FLAG: --fail-swap-on="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642899 4658 flags.go:64] FLAG: --feature-gates=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642910 4658 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642919 4658 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642928 4658 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642938 4658 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642947 4658 flags.go:64] FLAG: --healthz-port="10248"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642957 4658 flags.go:64] FLAG: --help="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642965 4658 flags.go:64] FLAG: --hostname-override=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642974 4658 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642983 4658 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.642992 4658 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643004 4658 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643013 4658 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643023 4658 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643032 4658 flags.go:64] FLAG: --image-service-endpoint=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643041 4658 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643051 4658 flags.go:64] FLAG: --kube-api-burst="100"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643060 4658 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643069 4658 flags.go:64] FLAG: --kube-api-qps="50"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643078 4658 flags.go:64] FLAG: --kube-reserved=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643087 4658 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643095 4658 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643105 4658 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643114 4658 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643123 4658 flags.go:64] FLAG: --lock-file=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643131 4658 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643140 4658 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643149 4658 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643162 4658 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643171 4658 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643180 4658 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643188 4658 flags.go:64] FLAG: --logging-format="text"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643198 4658 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643207 4658 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643216 4658 flags.go:64] FLAG: --manifest-url=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643225 4658 flags.go:64] FLAG: --manifest-url-header=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643236 4658 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643245 4658 flags.go:64] FLAG: --max-open-files="1000000"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643255 4658 flags.go:64] FLAG: --max-pods="110"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643265 4658 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643274 4658 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643330 4658 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643341 4658 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643350 4658 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643360 4658 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643368 4658 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643387 4658 flags.go:64] FLAG: --node-status-max-images="50"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643396 4658 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643405 4658 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643414 4658 flags.go:64] FLAG: --pod-cidr=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643423 4658 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643437 4658 flags.go:64] FLAG: --pod-manifest-path=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643445 4658 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643455 4658 flags.go:64] FLAG: --pods-per-core="0"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643463 4658 flags.go:64] FLAG: --port="10250"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643472 4658 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643481 4658 flags.go:64] FLAG: --provider-id=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643489 4658 flags.go:64] FLAG: --qos-reserved=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643499 4658 flags.go:64] FLAG: --read-only-port="10255"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643508 4658 flags.go:64] FLAG: --register-node="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643516 4658 flags.go:64] FLAG: --register-schedulable="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643525 4658 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643540 4658 flags.go:64] FLAG: --registry-burst="10"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643549 4658 flags.go:64] FLAG: --registry-qps="5"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643558 4658 flags.go:64] FLAG: --reserved-cpus=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643566 4658 flags.go:64] FLAG: --reserved-memory=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643577 4658 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643587 4658 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643597 4658 flags.go:64] FLAG: --rotate-certificates="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643606 4658 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643615 4658 flags.go:64] FLAG: --runonce="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643624 4658 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643634 4658 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643643 4658 flags.go:64] FLAG: --seccomp-default="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643651 4658 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643660 4658 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643669 4658 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643678 4658 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643687 4658 flags.go:64] FLAG: --storage-driver-password="root"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643696 4658 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643704 4658 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643713 4658 flags.go:64] FLAG: --storage-driver-user="root"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643722 4658 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643731 4658 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643740 4658 flags.go:64] FLAG: --system-cgroups=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643748 4658 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643762 4658 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643771 4658 flags.go:64] FLAG: --tls-cert-file=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643779 4658 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643790 4658 flags.go:64] FLAG: --tls-min-version=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643799 4658 flags.go:64] FLAG: --tls-private-key-file=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643807 4658 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643816 4658 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643825 4658 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643835 4658 flags.go:64] FLAG: --v="2"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643846 4658 flags.go:64] FLAG: --version="false"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643857 4658 flags.go:64] FLAG: --vmodule=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643867 4658 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.643877 4658 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644072 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644082 4658 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644090 4658 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644099 4658 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644107 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644114 4658 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644122 4658 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644130 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644138 4658 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644145 4658 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644153 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644161 4658 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644169 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644176 4658 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644185 4658 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644193 4658 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644201 4658 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644209 4658 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644216 4658 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644224 4658 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644232 4658 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644240 4658 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644248 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644255 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644263 4658 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644271 4658 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644279 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644287 4658 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644316 4658 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644324 4658 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644332 4658 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644341 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644349 4658 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644357 4658 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644364 4658 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644372 4658 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644379 4658 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644387 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644397 4658 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644414 4658 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644422 4658 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644431 4658 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644441 4658 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644451 4658 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644460 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644470 4658 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644479 4658 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644489 4658 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644497 4658 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644505 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644512 4658 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644522 4658 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644531 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644540 4658 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644548 4658 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644556 4658 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644564 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644571 4658 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644579 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644586 4658 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644594 4658 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644601 4658 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644609 4658 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644617 4658 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644625 4658 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644632 4658 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644640 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644649 4658 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644657 4658 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644664 4658 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.644672 4658 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.644700 4658 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.661853 4658 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.661894 4658 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662027 4658 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662039 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662049 4658 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662058 4658 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662068 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662075 4658 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662083 4658 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662091 4658 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662100 4658 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662108 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662118 4658 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662131 4658 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662141 4658 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662149 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662158 4658 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662166 4658 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662174 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662182 4658 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662189 4658 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662197 4658 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662205 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662213 4658 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662220 4658 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662228 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662236 4658 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662245 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662253 4658 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662261 4658 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662268 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662276 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662284 4658 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662323 4658 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662331 4658 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662339 4658 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662347 4658 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662355 4658 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662365 4658 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662374 4658 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662383 4658 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662391 4658 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662400 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662408 4658 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662416 4658 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662423 4658 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662431 4658 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662439 4658 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662447 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662455 4658 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662462 4658 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662470 4658 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662478 4658 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662486 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662493 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662501 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662509 4658 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662518 4658 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662525 4658 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662533 4658 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662541 4658 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662548 4658 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662556 4658 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662564 4658 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662572 4658 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662580 4658 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662590 4658 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662600 4658 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662609 4658 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662618 4658 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662628 4658 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662638 4658 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662647 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.662660 4658 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662880 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662892 4658 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662901 4658 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662909 4658 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662917 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662925 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662932 4658 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662940 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662948 4658 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662956 4658 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662964 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662972 4658 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662980 4658 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662987 4658 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.662997 4658 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663005 4658 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663014 4658 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663023 4658 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663030 4658 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663038 4658 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663046 4658 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663054 4658 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663061 4658 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663068 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663077 4658 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663084 4658 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663094 4658 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663105 4658 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663114 4658 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663125 4658 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663136 4658 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663147 4658 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663155 4658 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663163 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663171 4658 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663179 4658 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663186 4658 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663194 4658 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663201 4658 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663209 4658 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663217 4658 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663225 4658 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663232 4658 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663240 4658 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663247 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663256 4658 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663263 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663271 4658 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663279 4658 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663347 4658 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663359 4658 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663368 4658 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663377 4658 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663385 4658 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663393 4658 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663402 4658 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663411 4658 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663421 4658 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663429 4658 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663436 4658 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663444 4658 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663452 4658 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663462 4658 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663472 4658 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663480 4658 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663488 4658 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663496 4658 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663504 4658 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663512 4658 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663519 4658 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.663527 4658 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.663538 4658 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.668379 4658 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.675971 4658 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.676103 4658 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.682050 4658 server.go:997] "Starting client certificate rotation"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.682094 4658 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.682340 4658 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 08:06:13.934492668 +0000 UTC
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.682523 4658 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2276h47m34.251974856s for next certificate rotation
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.720000 4658 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.722877 4658 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.744093 4658 log.go:25] "Validated CRI v1 runtime API"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.786114 4658 log.go:25] "Validated CRI v1 image API"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.788432 4658 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.797069 4658 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-11-13-37-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.797131 4658 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:44 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.826111 4658 manager.go:217] Machine: {Timestamp:2025-10-02 11:18:39.821760708 +0000 UTC m=+0.712914365 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6a661c31-2aab-46f6-9356-aadb249c199d BootID:989d2d6c-e7aa-470a-8c4e-33361ee1def6 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0a:de:13 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0a:de:13 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a3:ad:01 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1b:2c:cf Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:68:60:55 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:99:b3:43 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:da:4f:56:06:96:b2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:82:2e:04:25:db Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.826579 4658 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.826800 4658 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.827464 4658 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.827829 4658 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.827893 4658 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.828330 4658 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.828359 4658 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.829409 4658 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.829464 4658 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.830411 4658 state_mem.go:36] "Initialized new in-memory state store" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.830573 4658 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.840156 4658 kubelet.go:418] "Attempting to sync node with API server" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.840207 4658 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.840251 4658 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.840275 4658 kubelet.go:324] "Adding apiserver pod source" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.840327 4658 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.856471 4658 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.858441 4658 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.865286 4658 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.866770 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.866804 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.866894 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.866925 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867387 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867418 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867426 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867434 4658 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867445 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867452 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867459 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867470 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867478 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867487 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867497 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.867928 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.870422 4658 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.870922 4658 server.go:1280] "Started kubelet" Oct 02 11:18:39 crc systemd[1]: Started Kubernetes Kubelet. Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.873122 4658 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.873267 4658 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.874002 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.874096 4658 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.876281 4658 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.876362 4658 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.876512 4658 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 03:12:12.014268208 +0000 UTC Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.876745 4658 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2271h53m32.137532503s for next certificate rotation Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.877433 4658 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.877455 4658 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.877556 4658 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.877634 4658 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.878371 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878543 4658 factory.go:153] Registering CRI-O factory Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.878658 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878800 4658 factory.go:221] Registration of the crio container factory successfully Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878910 4658 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878931 4658 factory.go:55] Registering systemd factory Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878948 4658 factory.go:221] Registration of the systemd container factory successfully Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.878985 4658 factory.go:103] Registering Raw factory Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.879012 4658 manager.go:1196] Started watching for new ooms in manager Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.879119 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.880231 4658 manager.go:319] Starting recovery of all containers Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.880890 4658 server.go:460] "Adding debug handlers to kubelet server" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.881063 4658 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa88b1c55adc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 11:18:39.870897607 +0000 UTC m=+0.762051174,LastTimestamp:2025-10-02 11:18:39.870897607 +0000 UTC m=+0.762051174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.912473 4658 manager.go:324] Recovery completed Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918354 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918438 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918463 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918496 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918518 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918539 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918561 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918582 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918605 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918624 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918646 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918665 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918683 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918710 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918730 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918750 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918769 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918792 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918812 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918836 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918854 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918874 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918896 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918917 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918936 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918957 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.918982 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919003 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919023 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919044 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919061 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919081 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919101 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919121 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919170 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919195 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919239 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919260 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919279 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919326 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919389 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919415 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919443 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919467 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919486 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919506 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919528 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919549 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919568 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919589 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919609 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919628 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919653 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919675 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919697 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919718 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919737 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919759 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919779 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919798 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919817 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919839 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919857 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919876 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919897 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919917 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919936 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919957 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919979 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.919998 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920018 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920038 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920057 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920078 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920097 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920116 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920135 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920155 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920174 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920196 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920217 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920238 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920258 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920276 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920328 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920359 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920379 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920400 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920419 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920440 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920459 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920481 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920501 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920524 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920543 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.920563 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924264 4658 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924336 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924362 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924383 4658 reconstruct.go:130] "Volume is marked as uncertain 
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924403 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924423 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924443 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924463 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924529 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924561 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924582 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924603 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924626 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924672 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924692 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924711 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924733 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924753 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924773 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924794 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924812 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924853 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924871 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924890 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924908 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924929 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924949 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924967 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.924987 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925006 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925026 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925045 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925067 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925085 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925104 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925123 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925140 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925157 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925176 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925195 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925214 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925231 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925250 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925268 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925287 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925334 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925377 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925395 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925416 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925435 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925454 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925471 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925489 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925509 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925528 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925546 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925564 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925583 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925603 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925624 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925642 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925660 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925680 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925700 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925719 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925736 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925756 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925774 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925793 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925811 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925830 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925851 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925870 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925889 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925907 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925926 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925946 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925966 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.925987 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926005 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926026 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926047 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926066 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926084 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926103 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926122 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926141 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926160 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926179 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926209 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926227 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926245 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926263 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926283 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926311 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926324 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926462 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926484 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926503 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926522 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926546 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926566 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926586 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926604 4658 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926645 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926678 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926703 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926752 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926807 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926861 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926893 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926948 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926967 4658 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.926987 4658 reconstruct.go:97] "Volume reconstruction finished" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.927001 4658 reconciler.go:26] "Reconciler: start to sync state" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.930805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 
11:18:39.930911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.930922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.936103 4658 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.936125 4658 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.936147 4658 state_mem.go:36] "Initialized new in-memory state store" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.944746 4658 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.947749 4658 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.947818 4658 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 11:18:39 crc kubenswrapper[4658]: I1002 11:18:39.947866 4658 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.947952 4658 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 11:18:39 crc kubenswrapper[4658]: W1002 11:18:39.961506 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.961597 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:39 crc kubenswrapper[4658]: E1002 11:18:39.978001 4658 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.048186 4658 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.078475 4658 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.080364 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.178644 4658 policy_none.go:49] "None policy: Start" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.178868 4658 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.180330 4658 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.180390 4658 state_mem.go:35] "Initializing new in-memory state store" 
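
The entries above show the kubelet bringing up its internal managers (CPU manager, memory manager, iptables rules, status manager) while every request to https://api-int.crc.testing:6443 (38.102.83.32:6443) fails with "connection refused": the RuntimeClass reflector cannot list, node "crc" is not found by the lister, and the node lease cannot be ensured, with the retry interval starting at 400ms. The following is a minimal, hypothetical Go sketch of that dial-and-back-off pattern only; it is not kubelet code, and the endpoint and starting interval are simply taken from the log lines above.

```go
// Hedged sketch: probe the apiserver endpoint the way the failing dials
// above suggest, doubling the retry interval like the lease controller's
// 400ms -> 800ms progression. Illustrative only, not the kubelet's code.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const addr = "api-int.crc.testing:6443" // endpoint seen in the log
	interval := 400 * time.Millisecond      // first retry interval in the log

	for attempt := 1; attempt <= 5; attempt++ {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("apiserver reachable")
			return
		}
		// Matches the "dial tcp ...: connect: connection refused" errors
		// while the kube-apiserver static pod has not started yet.
		fmt.Printf("attempt %d: %v; retrying in %s\n", attempt, err, interval)
		time.Sleep(interval)
		interval *= 2 // 400ms, 800ms, 1.6s, ...
	}
	fmt.Println("giving up")
}
```
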
Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.248757 4658 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.279437 4658 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.353429 4658 manager.go:334] "Starting Device Plugin manager" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.353491 4658 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.353510 4658 server.go:79] "Starting device plugin registration server" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.354042 4658 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.354066 4658 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.354343 4658 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.354502 4658 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.354522 4658 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.362169 4658 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.455162 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.456846 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.456912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.456924 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.456955 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.457666 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.481953 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.648986 4658 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
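
The "SyncLoop ADD" source="file" entry above is the kubelet picking up its five static pods from on-disk manifests, which is why the control plane (etcd, kube-apiserver, kube-controller-manager, kube-scheduler, kube-rbac-proxy-crio) can start while the API server is still unreachable. A minimal sketch of reading such a manifest directory follows; the path /etc/kubernetes/manifests is the usual convention and an assumption here, not something taken from the log.

```go
// Hedged sketch: enumerate a static-pod manifest directory the way the
// kubelet's file source does conceptually. Stand-in only; the real
// kubelet also watches for changes and decodes each manifest.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	manifestDir := "/etc/kubernetes/manifests" // assumed conventional path

	entries, err := os.ReadDir(manifestDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		if e.IsDir() {
			continue // static pod manifests are plain files
		}
		// Each manifest becomes one "SyncLoop ADD" pod, as in the log's
		// etcd-crc / kube-apiserver-crc / kube-controller-manager-crc /
		// openshift-kube-scheduler-crc / kube-rbac-proxy-crio-crc set.
		fmt.Println("SyncLoop ADD (file source):", filepath.Join(manifestDir, e.Name()))
	}
}
```
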
Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.649120 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650383 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650423 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650437 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650563 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650677 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.650724 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651486 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651512 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651594 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651746 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651794 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651753 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651869 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.651881 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652237 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652259 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652399 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652576 4658 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.652631 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653779 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653813 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653825 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653872 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653912 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653940 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.653960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654033 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654059 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654817 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654860 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.654995 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.655020 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.656235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.656272 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.656283 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.657910 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.659153 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.659207 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.659227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.659264 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.660171 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738043 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738122 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738167 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738201 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738238 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738273 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738327 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738360 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738388 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738414 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738466 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738492 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738519 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738576 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.738626 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.839848 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.839972 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840010 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840091 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840045 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840162 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840179 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840191 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: 
Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840218 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840220 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840256 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840347 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840375 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840272 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840403 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840462 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840529 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840615 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840682 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840716 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840748 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840773 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840796 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840815 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840903 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840908 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840920 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840834 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.841011 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.840879 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:18:40 crc kubenswrapper[4658]: W1002 11:18:40.852287 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:40 crc kubenswrapper[4658]: E1002 11:18:40.852450 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.875916 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:40 crc kubenswrapper[4658]: I1002 11:18:40.978786 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.007135 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.016491 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.025534 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3c796959004d01bd3bd6ba7ad74eca498e887401159e06d034abf973c47093d0 WatchSource:0}: Error finding container 3c796959004d01bd3bd6ba7ad74eca498e887401159e06d034abf973c47093d0: Status 404 returned error can't find the container with id 3c796959004d01bd3bd6ba7ad74eca498e887401159e06d034abf973c47093d0 Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.037526 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.044136 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.045851 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d85ffa5799366a61806dffbf89a32e03fa5508080c5e08dcfb322b78d1626681 WatchSource:0}: Error finding container d85ffa5799366a61806dffbf89a32e03fa5508080c5e08dcfb322b78d1626681: Status 404 returned error can't find the container with id d85ffa5799366a61806dffbf89a32e03fa5508080c5e08dcfb322b78d1626681 Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.048533 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-55b04e422e1420ab0d551b5f69e4446d4b7ef3374c849924894562be47d45bed WatchSource:0}: Error finding container 55b04e422e1420ab0d551b5f69e4446d4b7ef3374c849924894562be47d45bed: Status 404 returned error can't find the container with id 55b04e422e1420ab0d551b5f69e4446d4b7ef3374c849924894562be47d45bed Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.060248 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7b48dda6a44916c1da02e84745465bc4f6407f90e8df9064eeb8b835c3532e7d WatchSource:0}: Error finding container 7b48dda6a44916c1da02e84745465bc4f6407f90e8df9064eeb8b835c3532e7d: Status 404 returned error can't find the container with id 7b48dda6a44916c1da02e84745465bc4f6407f90e8df9064eeb8b835c3532e7d Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.060259 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.064384 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.064436 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.064449 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.064475 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.064927 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.066783 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7119e49aff6b283aee789acaa0e6c6c437fe5d77f6d4b3567c9399ff3e33c9fa WatchSource:0}: Error finding container 7119e49aff6b283aee789acaa0e6c6c437fe5d77f6d4b3567c9399ff3e33c9fa: Status 404 returned error can't find the container with id 7119e49aff6b283aee789acaa0e6c6c437fe5d77f6d4b3567c9399ff3e33c9fa Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.074120 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.074202 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.112990 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.113110 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:41 crc kubenswrapper[4658]: W1002 11:18:41.130310 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.130404 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.283716 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.865798 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.867676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.867712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.867721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.867746 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:41 crc kubenswrapper[4658]: E1002 11:18:41.868152 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.874591 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.956990 4658 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9" exitCode=0 Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.957096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.957192 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3c796959004d01bd3bd6ba7ad74eca498e887401159e06d034abf973c47093d0"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.957287 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.958580 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.958660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.958676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.959932 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.959969 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7119e49aff6b283aee789acaa0e6c6c437fe5d77f6d4b3567c9399ff3e33c9fa"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.960071 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.960930 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.960962 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.960974 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.961845 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.961903 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7b48dda6a44916c1da02e84745465bc4f6407f90e8df9064eeb8b835c3532e7d"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.963556 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.963597 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"55b04e422e1420ab0d551b5f69e4446d4b7ef3374c849924894562be47d45bed"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.963710 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.964587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.964625 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.964640 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.965674 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.965706 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d85ffa5799366a61806dffbf89a32e03fa5508080c5e08dcfb322b78d1626681"} Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.965818 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.967531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.967596 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:41 crc kubenswrapper[4658]: I1002 11:18:41.967617 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.875141 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:42 crc kubenswrapper[4658]: E1002 11:18:42.884888 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.971751 4658 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc" exitCode=0 Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.971841 4658 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e" exitCode=0 Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.971873 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.971951 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.972123 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.973621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.973677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.973700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.975685 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.975864 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.977350 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.977391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.977413 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.978121 4658 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879" exitCode=0 Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.978197 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.978378 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.979595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.979644 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.979722 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.984201 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.984241 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.984260 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.984261 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.985215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.985253 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.985271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.987411 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d" exitCode=0 Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.987457 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d"} Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.987612 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.989013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.989049 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.989058 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.990655 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.991841 4658 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.991899 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:42 crc kubenswrapper[4658]: I1002 11:18:42.991923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:43 crc kubenswrapper[4658]: W1002 11:18:43.096217 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.096367 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.126095 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:43 crc kubenswrapper[4658]: W1002 11:18:43.169894 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.169990 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.300148 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:43 crc kubenswrapper[4658]: W1002 11:18:43.307835 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.307910 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:43 crc kubenswrapper[4658]: W1002 11:18:43.418998 4658 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.419067 4658 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.469156 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.470849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.470891 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.470902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.470938 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.471552 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 02 11:18:43 crc kubenswrapper[4658]: E1002 11:18:43.726653 4658 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa88b1c55adc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 11:18:39.870897607 +0000 UTC m=+0.762051174,LastTimestamp:2025-10-02 11:18:39.870897607 +0000 UTC m=+0.762051174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.875483 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.995264 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243"} Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.995374 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6"} Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.995399 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5"} Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.995418 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6"} Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.998869 4658 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12" exitCode=0 Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.998978 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12"} Oct 02 11:18:43 crc kubenswrapper[4658]: I1002 11:18:43.999209 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.000927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.000999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.001023 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.016998 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.017394 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124"} Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.017434 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a"} Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.017448 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea"} Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.017526 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018419 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018432 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018429 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018582 4658 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.018611 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.854693 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:18:44 crc kubenswrapper[4658]: I1002 11:18:44.874874 4658 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.025979 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516"} Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.026215 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.027176 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.027230 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.027246 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031104 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031082 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77"} Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031134 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031148 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031162 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71"} Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.031235 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd"} Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032149 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032189 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032208 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032277 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032320 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:45 crc kubenswrapper[4658]: I1002 11:18:45.032334 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.039675 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.039694 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba"} Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.039741 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.039768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d"} Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.039749 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.041284 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.041374 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.041392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.041982 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.042038 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.042055 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.483712 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.672436 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.674008 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.674071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.674082 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 
11:18:46.674112 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:18:46 crc kubenswrapper[4658]: I1002 11:18:46.771623 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.042008 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.042016 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043668 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.043742 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.202997 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.203252 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.203355 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.204779 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.204826 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:47 crc kubenswrapper[4658]: I1002 11:18:47.204843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.045410 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.046438 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.046497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.046514 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.316996 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:48 
crc kubenswrapper[4658]: I1002 11:18:48.317202 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.318525 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.318581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.318599 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.483370 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.894429 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.895068 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.897112 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.897159 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:48 crc kubenswrapper[4658]: I1002 11:18:48.897176 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.047727 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.049080 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.049122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.049134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.172903 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.173182 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.174780 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.174822 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:49 crc kubenswrapper[4658]: I1002 11:18:49.174833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:50 crc kubenswrapper[4658]: E1002 11:18:50.362367 4658 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 11:18:52 crc kubenswrapper[4658]: I1002 11:18:52.173289 4658 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 11:18:52 crc kubenswrapper[4658]: I1002 11:18:52.173440 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:18:53 crc kubenswrapper[4658]: I1002 11:18:53.335133 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 11:18:53 crc kubenswrapper[4658]: I1002 11:18:53.335418 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:53 crc kubenswrapper[4658]: I1002 11:18:53.336934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:53 crc kubenswrapper[4658]: I1002 11:18:53.337018 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:53 crc kubenswrapper[4658]: I1002 11:18:53.337072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.067268 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.069906 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516" exitCode=255 Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.069965 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516"} Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.070163 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.071195 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.071282 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.071327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.072116 4658 scope.go:117] "RemoveContainer" containerID="18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.420520 4658 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.420594 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.428736 4658 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 11:18:55 crc kubenswrapper[4658]: I1002 11:18:55.429115 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.075686 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.078858 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3"} Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.079218 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.080203 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.080347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.080437 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:56 crc kubenswrapper[4658]: I1002 11:18:56.772221 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:57 crc kubenswrapper[4658]: I1002 11:18:57.081810 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:57 crc kubenswrapper[4658]: I1002 11:18:57.082772 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:57 crc kubenswrapper[4658]: I1002 11:18:57.082810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:57 crc kubenswrapper[4658]: I1002 11:18:57.082818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.324725 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.324917 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.326432 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.326513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.326539 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.490143 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.490402 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.491891 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.491958 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.491983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:18:58 crc kubenswrapper[4658]: I1002 11:18:58.497392 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:18:59 crc kubenswrapper[4658]: I1002 11:18:59.088009 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:18:59 crc kubenswrapper[4658]: I1002 11:18:59.089546 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:18:59 crc kubenswrapper[4658]: I1002 11:18:59.089638 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:18:59 crc kubenswrapper[4658]: I1002 11:18:59.089657 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.362499 4658 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.425969 4658 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.429366 4658 trace.go:236] Trace[1649116879]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:18:47.396) (total time: 13033ms): Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1649116879]: ---"Objects listed" error: 13033ms (11:19:00.429) Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1649116879]: [13.033197807s] [13.033197807s] END Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.429425 4658 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.430367 4658 trace.go:236] Trace[1229897305]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:18:46.848) (total time: 13581ms): Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1229897305]: ---"Objects listed" error:<nil> 13581ms (11:19:00.430) Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1229897305]: [13.581606592s] [13.581606592s] END Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.430406 4658 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.432013 4658 trace.go:236] Trace[1621782398]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:18:47.160) (total time: 13271ms): Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1621782398]: ---"Objects listed" error:<nil> 13270ms (11:19:00.431) Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[1621782398]: [13.271005073s] [13.271005073s] END Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.432062 4658 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.432173 4658 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.434842 4658 trace.go:236] Trace[631229483]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:18:49.556) (total time: 10878ms): Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[631229483]: ---"Objects listed" error:<nil> 10877ms (11:19:00.434) Oct 02 11:19:00 crc kubenswrapper[4658]: Trace[631229483]: [10.87804179s] [10.87804179s] END Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.434890 4658 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.435364 4658 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.853675 4658 apiserver.go:52] "Watching apiserver" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.857060 4658 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.857471 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.857851 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.857922 4658 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.858028 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.857925 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.858078 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.858092 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.858213 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.858576 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.858644 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.860744 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.860842 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.861422 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.861650 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.862036 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.862722 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.862829 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.862978 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.864656 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.878103 4658 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.893704 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.912894 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.927753 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934522 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934565 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934583 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934602 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934617 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934649 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934664 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934684 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934698 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934715 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934729 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934744 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934758 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934775 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934795 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934810 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934828 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934870 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934886 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934903 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934921 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934938 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934957 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934974 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.934990 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935009 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935026 4658 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935014 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935044 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935064 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935014 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935082 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935101 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935124 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935140 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935156 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935174 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935192 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935208 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935223 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935233 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod 
"6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935244 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935276 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935345 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935382 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935453 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935482 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935607 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935633 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935662 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 
11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935691 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935713 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935735 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935757 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935783 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935806 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935830 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935856 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935881 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935901 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935922 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935945 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935969 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.935988 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936012 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936039 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936061 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936084 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936088 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936117 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936142 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936166 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936192 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936144 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936214 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936244 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936290 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936341 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936370 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936395 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936428 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936448 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936464 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936457 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936509 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936529 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936548 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936576 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936599 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936596 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936617 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936699 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936721 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936728 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936773 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936790 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936825 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936852 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936878 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936902 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936927 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936938 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936955 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.936984 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937009 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937032 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937044 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937056 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937086 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937111 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937138 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937160 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937165 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937183 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937218 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937245 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937274 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937282 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937330 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937367 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937391 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937414 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937421 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937436 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937463 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937484 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937506 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937530 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937556 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937574 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937582 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937600 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937622 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937654 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937684 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937710 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937752 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937786 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937815 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937844 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.937955 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938140 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938407 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938411 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938458 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938503 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938544 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938577 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938598 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938615 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 
11:19:00.938632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938651 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938667 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938689 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938708 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938728 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938747 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938763 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938781 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938799 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 
11:19:00.938816 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938833 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938851 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938867 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938882 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938900 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938918 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938934 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938953 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938969 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc 
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938985 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939002 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939017 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939033 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939050 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939069 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939090 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939677 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939720 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938608 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938684 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938704 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938787 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938886 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949402 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.938894 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939064 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.939071 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940083 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940464 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940595 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940628 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940858 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.941113 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.940971 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.941347 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.941358 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.941893 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.942501 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.943023 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:01.442991722 +0000 UTC m=+22.334145279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949675 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949722 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949749 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949774 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949803 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949825 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949870 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949897 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949918 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949941 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949964 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950014 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950031 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950049 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950075 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950103 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950127 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950148 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950168 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950187 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950207 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950227 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950250 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950274 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950313 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950338 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950357 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950376 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950401 4658 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950465 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950679 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950705 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950707 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950728 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950734 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950779 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950817 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950844 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950866 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950890 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950911 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950973 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.950990 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951046 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951072 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951068 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951107 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951124 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.951206 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.951555 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:01.451535886 +0000 UTC m=+22.342689453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951650 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943052 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943099 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943133 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943204 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943399 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943427 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943496 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.943586 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.944462 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.945135 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946087 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946343 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946582 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946615 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946797 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.947946 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.948225 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949150 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.949323 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951168 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951279 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951274 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.942996 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.952021 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.952842 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.953169 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.953881 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.954221 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.954261 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.954336 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.951152 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955233 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955255 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955275 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955327 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955355 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955392 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955449 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955478 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955503 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955531 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955480 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955612 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955852 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955878 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.955789 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.954542 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.956193 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.956473 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:01.456444907 +0000 UTC m=+22.347598674 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.957533 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.962781 4658 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.963594 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.965859 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967686 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967848 4658 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967870 4658 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967886 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967897 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967909 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967919 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967930 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967941 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967951 4658 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967962 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967972 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967984 4658 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.967994 4658 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968004 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968014 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968024 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968035 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968045 4658 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968054 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968064 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968074 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968085 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968096 4658 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968109 4658 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 
11:19:00.968121 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968134 4658 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968146 4658 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968156 4658 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968165 4658 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968176 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968186 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968197 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968210 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968221 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968231 4658 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968244 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968255 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc 
kubenswrapper[4658]: I1002 11:19:00.968264 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968274 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968286 4658 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968321 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968344 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968356 4658 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968366 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968377 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968393 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968411 4658 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968424 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968436 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968453 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968466 4658 
reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968480 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968492 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968508 4658 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968521 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968535 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968549 4658 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968559 4658 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968568 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968578 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968588 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968600 4658 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968612 4658 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968624 4658 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968636 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968649 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968659 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968671 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968680 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968692 4658 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968701 4658 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968710 4658 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968701 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968724 4658 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968722 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968749 4658 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968774 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968790 4658 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.968806 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.969039 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.969239 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.969284 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.966138 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.969955 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.970240 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.972800 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.972929 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973025 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973071 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.972827 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973272 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973492 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973543 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.973824 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.973831 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.973871 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.973893 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.973997 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:01.473944247 +0000 UTC m=+22.365097844 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974276 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974470 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974528 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974904 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974931 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974977 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975000 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.974162 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975236 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975492 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975572 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975712 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975780 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.975926 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.976067 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.976099 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.976113 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.976141 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: E1002 11:19:00.976170 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:01.476150165 +0000 UTC m=+22.367303942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.976272 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.976430 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.976855 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.976891 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977237 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946817 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.946828 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977369 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977383 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977637 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977898 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.977910 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.947158 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.978468 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.978472 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.979051 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.979460 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.947220 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.979856 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.947279 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.980143 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.980425 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.981052 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.981372 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.981399 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.981527 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.981812 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.983541 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.984429 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.984451 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.985004 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.985556 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.985759 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.986179 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.989482 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.989930 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.990691 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.990987 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.990998 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.991765 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.991874 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.991904 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.992747 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.995400 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.995699 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.998631 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.999106 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.999216 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:00 crc kubenswrapper[4658]: I1002 11:19:00.999208 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999261 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999571 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999599 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999606 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999631 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999652 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999897 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:00.999922 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.000027 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.000194 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.000346 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.000412 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.000977 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.001199 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002064 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002130 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002448 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002504 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002803 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002907 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.002965 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.003196 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.004600 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.004688 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.004728 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.005839 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.005940 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.006028 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.008218 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.008747 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.012341 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.017480 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.025701 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.030895 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.031118 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.043417 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.053345 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.069969 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.070342 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.070570 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.070714 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.070821 4658 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.070915 4658 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071008 4658 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071105 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071206 4658 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071314 4658 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071398 4658 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071498 4658 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071571 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071643 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071750 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071823 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071888 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.071943 4658 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072000 4658 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072052 4658 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072140 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072207 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072259 4658 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072331 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072386 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072449 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072537 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072599 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072662 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072728 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072791 4658 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072875 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072949 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073028 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073105 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073181 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073240 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073320 4658 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073401 4658 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073472 4658 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073529 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073584 4658 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073646 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073716 4658 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073773 4658 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.073844 4658 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072939 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:01 crc 
kubenswrapper[4658]: I1002 11:19:01.073915 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074077 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.072886 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074110 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074162 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074178 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074194 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074207 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074221 4658 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074233 4658 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074249 4658 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074261 4658 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074273 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074282 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074309 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074319 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074328 4658 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074337 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074346 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074356 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074365 4658 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074374 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074385 4658 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074394 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074404 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074414 4658 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074424 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074435 4658 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074446 4658 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074457 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074472 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074482 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074495 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074505 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074516 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074525 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074535 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074545 4658 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc 
kubenswrapper[4658]: I1002 11:19:01.074554 4658 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074564 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074573 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074584 4658 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074594 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074604 4658 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074613 4658 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074622 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074631 4658 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074640 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074651 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074662 4658 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074672 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc 
kubenswrapper[4658]: I1002 11:19:01.074682 4658 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074691 4658 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074700 4658 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074709 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074720 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074729 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074738 4658 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074748 4658 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074757 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074767 4658 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074776 4658 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074783 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074792 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 
11:19:01.074802 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074811 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074820 4658 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074829 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074838 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074847 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074857 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074866 4658 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074874 4658 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.074883 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.172889 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.183248 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.186201 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:19:01 crc kubenswrapper[4658]: W1002 11:19:01.199652 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c4e8899cb5a47c65e4263f2693f6cf013d2c3837f0c80b962d4352ec0d3a33ec WatchSource:0}: Error finding container c4e8899cb5a47c65e4263f2693f6cf013d2c3837f0c80b962d4352ec0d3a33ec: Status 404 returned error can't find the container with id c4e8899cb5a47c65e4263f2693f6cf013d2c3837f0c80b962d4352ec0d3a33ec Oct 02 11:19:01 crc kubenswrapper[4658]: W1002 11:19:01.204510 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2eb7d5cda6605bd51d731abdb23492b1708857547f4b1a6b401c0c6e0273287c WatchSource:0}: Error finding container 2eb7d5cda6605bd51d731abdb23492b1708857547f4b1a6b401c0c6e0273287c: Status 404 returned error can't find the container with id 2eb7d5cda6605bd51d731abdb23492b1708857547f4b1a6b401c0c6e0273287c Oct 02 11:19:01 crc kubenswrapper[4658]: W1002 11:19:01.207727 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bd50020b6e91fe1270723978965ce1607ef6fe6776d9c489df9e0bbe0a5c3330 WatchSource:0}: Error finding container bd50020b6e91fe1270723978965ce1607ef6fe6776d9c489df9e0bbe0a5c3330: Status 404 returned error can't find the container with id bd50020b6e91fe1270723978965ce1607ef6fe6776d9c489df9e0bbe0a5c3330 Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.387480 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.402473 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.403017 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.407332 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.420137 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.440846 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.461107 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.475278 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.478159 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.478227 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478260 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:02.478238682 +0000 UTC m=+23.369392249 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478319 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.478309 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478362 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-02 11:19:02.478350676 +0000 UTC m=+23.369504243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.478378 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.478398 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478460 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478499 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:02.47849275 +0000 UTC m=+23.369646317 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478500 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478529 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478541 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478579 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478614 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478626 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478593 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:02.478577783 +0000 UTC m=+23.369731350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.478690 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:02.478674716 +0000 UTC m=+23.369828283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.493212 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.510142 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.533646 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.553160 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.566689 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.582928 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.602042 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.606143 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d9dfl"] Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.606478 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.608027 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.609494 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.611923 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.615450 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.627029 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.639440 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.648262 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.665385 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.678485 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.693074 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.706127 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.715991 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.781536 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9423545-b965-4de6-86b1-5af8bdf55a24-hosts-file\") pod \"node-resolver-d9dfl\" (UID: \"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.781879 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vx5\" (UniqueName: \"kubernetes.io/projected/e9423545-b965-4de6-86b1-5af8bdf55a24-kube-api-access-m6vx5\") pod \"node-resolver-d9dfl\" (UID: \"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.882451 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vx5\" (UniqueName: \"kubernetes.io/projected/e9423545-b965-4de6-86b1-5af8bdf55a24-kube-api-access-m6vx5\") pod \"node-resolver-d9dfl\" (UID: \"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.882502 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9423545-b965-4de6-86b1-5af8bdf55a24-hosts-file\") pod \"node-resolver-d9dfl\" (UID: \"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.882613 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9423545-b965-4de6-86b1-5af8bdf55a24-hosts-file\") pod \"node-resolver-d9dfl\" (UID: \"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.903555 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vx5\" (UniqueName: \"kubernetes.io/projected/e9423545-b965-4de6-86b1-5af8bdf55a24-kube-api-access-m6vx5\") pod \"node-resolver-d9dfl\" (UID: 
\"e9423545-b965-4de6-86b1-5af8bdf55a24\") " pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.918132 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d9dfl" Oct 02 11:19:01 crc kubenswrapper[4658]: W1002 11:19:01.929830 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9423545_b965_4de6_86b1_5af8bdf55a24.slice/crio-4dd0ffe957cda0ad9f60935d1e8799737096b790e9927273b94b680e4ea12d0e WatchSource:0}: Error finding container 4dd0ffe957cda0ad9f60935d1e8799737096b790e9927273b94b680e4ea12d0e: Status 404 returned error can't find the container with id 4dd0ffe957cda0ad9f60935d1e8799737096b790e9927273b94b680e4ea12d0e Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.948105 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.948215 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.948354 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:01 crc kubenswrapper[4658]: E1002 11:19:01.948435 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.952852 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.953505 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.954628 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.955364 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.956039 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.956640 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.957403 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.958049 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.958831 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.961337 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.961998 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.964272 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.965738 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.969243 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.969907 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.971169 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.971902 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.972394 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.973772 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.974756 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.975941 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.976883 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.977485 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.978951 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.979511 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.985938 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.987027 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.988256 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.988888 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.989424 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.989882 4658 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.989982 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.992634 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.993237 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.993870 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 11:19:01 crc kubenswrapper[4658]: I1002 11:19:01.999368 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.000190 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.000813 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.005061 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.005911 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.010177 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.010971 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.012191 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.013450 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.013989 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.015245 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.015879 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.017431 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.019842 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.020509 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.021651 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.022316 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.023534 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.024106 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.099120 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.099179 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.099195 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2eb7d5cda6605bd51d731abdb23492b1708857547f4b1a6b401c0c6e0273287c"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.100489 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d9dfl" event={"ID":"e9423545-b965-4de6-86b1-5af8bdf55a24","Type":"ContainerStarted","Data":"4dd0ffe957cda0ad9f60935d1e8799737096b790e9927273b94b680e4ea12d0e"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.101768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.101793 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c4e8899cb5a47c65e4263f2693f6cf013d2c3837f0c80b962d4352ec0d3a33ec"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.104383 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.106615 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.109904 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3" exitCode=255 Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.109970 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.110011 4658 scope.go:117] "RemoveContainer" containerID="18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.113500 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bd50020b6e91fe1270723978965ce1607ef6fe6776d9c489df9e0bbe0a5c3330"} Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.118735 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.133812 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.148318 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.160803 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.175480 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.189269 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.200561 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.215881 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.230913 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.244191 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.267081 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.279943 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.296395 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.296651 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.296936 4658 scope.go:117] "RemoveContainer" containerID="f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.297096 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.326442 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.342079 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-thtgx"] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.342488 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pnjp5"] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.342689 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.342906 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.359261 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2t8w8"] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.359439 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.359812 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.359947 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.370696 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.370953 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.371000 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.370963 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.371052 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.371205 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.371221 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.371370 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.373409 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.373596 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.373748 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.374729 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.375944 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.383035 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 
11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.383156 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-fnfts"] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.382940 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.383774 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.384918 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.389499 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.392385 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.421387 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.452371 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.476972 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.487802 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.487911 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cni-binary-copy\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.487977 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:04.487951267 +0000 UTC m=+25.379104834 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488031 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-multus\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488130 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488157 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488181 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488205 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8hnd\" (UniqueName: \"kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488227 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/53173b86-be4f-4b39-8f70-f7282ab529fb-mcd-auth-proxy-config\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488247 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cnibin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488279 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-system-cni-dir\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488318 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-cnibin\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488341 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488359 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488373 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488413 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-conf-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488433 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-netns\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488450 4658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-os-release\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488469 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488484 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488500 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-socket-dir-parent\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488515 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-kubelet\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488531 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-etc-kubernetes\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488640 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-system-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488681 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-bin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488708 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488728 
4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-k8s-cni-cncf-io\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488744 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-daemon-config\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488757 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488775 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488794 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7ft\" (UniqueName: \"kubernetes.io/projected/894543ca-6e44-42e8-b41b-4578646d527f-kube-api-access-gb7ft\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488814 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488852 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-multus-certs\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488869 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488928 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488953 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488971 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-hostroot\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.488990 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489004 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-os-release\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489028 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489047 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489069 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489093 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489107 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489122 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53173b86-be4f-4b39-8f70-f7282ab529fb-rootfs\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489133 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489142 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489141 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53173b86-be4f-4b39-8f70-f7282ab529fb-proxy-tls\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489175 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489189 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489203 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:04.489187746 +0000 UTC m=+25.380341313 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489249 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489288 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489322 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:04.489303169 +0000 UTC m=+25.380456736 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489349 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hqhf\" (UniqueName: \"kubernetes.io/projected/53173b86-be4f-4b39-8f70-f7282ab529fb-kube-api-access-7hqhf\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489361 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489374 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489385 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489418 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2025-10-02 11:19:04.489409462 +0000 UTC m=+25.380563029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489383 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptmh\" (UniqueName: \"kubernetes.io/projected/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-kube-api-access-7ptmh\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489450 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489466 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489483 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.489497 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489579 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.489614 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:04.489607339 +0000 UTC m=+25.380760906 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.505438 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.523961 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.541679 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.571484 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.584794 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592402 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592547 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592586 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592616 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-multus\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592642 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cni-binary-copy\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592658 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592660 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592677 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592772 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8hnd\" (UniqueName: \"kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592799 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53173b86-be4f-4b39-8f70-f7282ab529fb-mcd-auth-proxy-config\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592824 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592845 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cnibin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592867 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592890 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592913 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592932 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-system-cni-dir\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592951 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-cnibin\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592969 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-conf-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.592991 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-os-release\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593010 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-netns\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593037 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-socket-dir-parent\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593056 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-kubelet\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593076 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-etc-kubernetes\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593094 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593098 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593127 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593150 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593176 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593193 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-system-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593215 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-bin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593234 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-k8s-cni-cncf-io\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593253 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-daemon-config\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593273 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7ft\" (UniqueName: 
\"kubernetes.io/projected/894543ca-6e44-42e8-b41b-4578646d527f-kube-api-access-gb7ft\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.593312 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.594761 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.594827 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-multus\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.594912 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595070 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595110 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-multus-certs\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595140 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595159 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595177 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnfts\" 
(UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595196 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595240 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595270 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-multus-certs\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595273 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53173b86-be4f-4b39-8f70-f7282ab529fb-mcd-auth-proxy-config\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595312 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595338 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595341 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595381 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cnibin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595410 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 
02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595434 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-hostroot\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595453 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595471 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-os-release\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595489 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595506 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595521 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595539 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53173b86-be4f-4b39-8f70-f7282ab529fb-rootfs\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595579 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53173b86-be4f-4b39-8f70-f7282ab529fb-rootfs\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595606 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595629 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-hostroot\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595649 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595769 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/894543ca-6e44-42e8-b41b-4578646d527f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595805 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595984 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-os-release\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596047 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53173b86-be4f-4b39-8f70-f7282ab529fb-proxy-tls\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596071 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596092 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hqhf\" (UniqueName: \"kubernetes.io/projected/53173b86-be4f-4b39-8f70-f7282ab529fb-kube-api-access-7hqhf\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596110 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptmh\" (UniqueName: \"kubernetes.io/projected/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-kube-api-access-7ptmh\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596253 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-system-cni-dir\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596304 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-cnibin\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-conf-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596370 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-os-release\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596374 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-cni-binary-copy\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.595882 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/894543ca-6e44-42e8-b41b-4578646d527f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596475 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-socket-dir-parent\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596508 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-kubelet\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596532 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-etc-kubernetes\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596550 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-netns\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " 
pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596578 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596599 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596624 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-var-lib-cni-bin\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596639 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596930 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.596977 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-system-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.597007 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.597592 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-daemon-config\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.597630 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-host-run-k8s-cni-cncf-io\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.598653 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.598844 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-multus-cni-dir\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.603118 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.606778 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53173b86-be4f-4b39-8f70-f7282ab529fb-proxy-tls\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.614134 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8hnd\" (UniqueName: \"kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd\") pod \"ovnkube-node-2t8w8\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.619183 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7ft\" (UniqueName: \"kubernetes.io/projected/894543ca-6e44-42e8-b41b-4578646d527f-kube-api-access-gb7ft\") pod \"multus-additional-cni-plugins-fnfts\" (UID: \"894543ca-6e44-42e8-b41b-4578646d527f\") " pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.619352 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hqhf\" (UniqueName: \"kubernetes.io/projected/53173b86-be4f-4b39-8f70-f7282ab529fb-kube-api-access-7hqhf\") pod \"machine-config-daemon-pnjp5\" (UID: \"53173b86-be4f-4b39-8f70-f7282ab529fb\") " pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.619673 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptmh\" (UniqueName: \"kubernetes.io/projected/69a005aa-c7db-4d46-968b-8a9a0c00bbd5-kube-api-access-7ptmh\") pod \"multus-thtgx\" (UID: \"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\") " pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.624706 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.645600 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.654439 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-thtgx" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.662588 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" 
for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: W1002 11:19:02.666757 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a005aa_c7db_4d46_968b_8a9a0c00bbd5.slice/crio-a71ad52d2ce650b774ae201c1708f5031f4e596d16687365bebc8cc70dd83761 WatchSource:0}: Error finding container a71ad52d2ce650b774ae201c1708f5031f4e596d16687365bebc8cc70dd83761: Status 404 returned error can't find the container with id a71ad52d2ce650b774ae201c1708f5031f4e596d16687365bebc8cc70dd83761 Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.673421 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.679307 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.681658 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:02 crc kubenswrapper[4658]: W1002 11:19:02.690401 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53173b86_be4f_4b39_8f70_f7282ab529fb.slice/crio-877686714f7b05ab7000483533949bba64f38c12161f4df0a59efcbf609a30c4 WatchSource:0}: Error finding container 877686714f7b05ab7000483533949bba64f38c12161f4df0a59efcbf609a30c4: Status 404 returned error can't find the container with id 877686714f7b05ab7000483533949bba64f38c12161f4df0a59efcbf609a30c4 Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.693237 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnfts" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.695330 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:18:55Z\\\",\\\"message\\\":\\\"W1002 11:18:44.243137 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 
11:18:44.243638 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759403924 cert, and key in /tmp/serving-cert-1187295998/serving-signer.crt, /tmp/serving-cert-1187295998/serving-signer.key\\\\nI1002 11:18:44.646447 1 observer_polling.go:159] Starting file observer\\\\nW1002 11:18:44.650716 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:18:44.650945 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:18:44.656565 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1187295998/tls.crt::/tmp/serving-cert-1187295998/tls.key\\\\\\\"\\\\nF1002 11:18:54.976099 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 
11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.709336 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:02 crc kubenswrapper[4658]: I1002 11:19:02.948210 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:02 crc kubenswrapper[4658]: E1002 11:19:02.948404 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.118796 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.118870 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"877686714f7b05ab7000483533949bba64f38c12161f4df0a59efcbf609a30c4"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.120385 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerStarted","Data":"fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.120429 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerStarted","Data":"a71ad52d2ce650b774ae201c1708f5031f4e596d16687365bebc8cc70dd83761"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.122355 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.125166 4658 scope.go:117] "RemoveContainer" containerID="f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3" Oct 02 11:19:03 crc kubenswrapper[4658]: E1002 11:19:03.125357 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.126196 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60" exitCode=0 Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.126287 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.126338 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"99952b290a04ac2328b8df7609f76d5d287fa5ccad3f5e0120a0de11aadaf9b9"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.127668 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d9dfl" event={"ID":"e9423545-b965-4de6-86b1-5af8bdf55a24","Type":"ContainerStarted","Data":"76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 
11:19:03.130088 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerStarted","Data":"e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.130124 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerStarted","Data":"dbe13b7647a52615c650e530335a5899b017bbc6312554a8d5e18a16357678e1"} Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.146431 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.165549 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.181330 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.197982 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.211946 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.227660 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.241809 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.258490 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.275458 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18adce49bd32a849da6616c4e37956ff62d7e617a0906bd8861269080d2f7516\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:18:55Z\\\",\\\"message\\\":\\\"W1002 11:18:44.243137 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 11:18:44.243638 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759403924 cert, and key in /tmp/serving-cert-1187295998/serving-signer.crt, /tmp/serving-cert-1187295998/serving-signer.key\\\\nI1002 11:18:44.646447 1 observer_polling.go:159] Starting file observer\\\\nW1002 11:18:44.650716 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:18:44.650945 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:18:44.656565 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1187295998/tls.crt::/tmp/serving-cert-1187295998/tls.key\\\\\\\"\\\\nF1002 11:18:54.976099 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.290255 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.307685 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.324669 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.342021 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.357614 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.382566 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.388809 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.407181 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.422360 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.467452 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.478809 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.508847 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\
\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.560316 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.637776 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.652365 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.672510 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.694793 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.708015 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.736743 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.756227 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.781276 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.797779 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.811926 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.835818 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.854960 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has 
all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.871873 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.889320 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.905499 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.922638 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.938906 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.948418 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.948436 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:03 crc kubenswrapper[4658]: E1002 11:19:03.948618 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:03 crc kubenswrapper[4658]: E1002 11:19:03.948823 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.966715 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.981245 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:03 crc kubenswrapper[4658]: I1002 11:19:03.994611 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.018305 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136804 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136845 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136855 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136864 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136873 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.136884 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.139052 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6" exitCode=0 Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.139133 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.140791 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2"} Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.142924 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458"} Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.155340 4658 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.157567 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.173803 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.192850 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.212422 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.225820 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.245354 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.258424 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.273149 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.294772 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b
394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.309157 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.327836 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.345377 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc 
kubenswrapper[4658]: I1002 11:19:04.363979 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.382088 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.406064 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.422501 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.462983 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.503941 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.513752 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.513901 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:08.513883369 +0000 UTC m=+29.405036936 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.513953 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.514003 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.514032 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.514054 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514132 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 
11:19:04.514171 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:08.514163036 +0000 UTC m=+29.405316603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514249 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514288 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:08.51427978 +0000 UTC m=+29.405433347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514374 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514397 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514410 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514438 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:08.514430715 +0000 UTC m=+29.405584282 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514488 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514502 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514511 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.514537 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:08.514529518 +0000 UTC m=+29.405683085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.545650 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.586211 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.625337 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.666843 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.706983 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc 
kubenswrapper[4658]: I1002 11:19:04.717071 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-nwq8l"] Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.717455 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.735342 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.755862 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.776119 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.795767 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.816993 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbtx4\" (UniqueName: \"kubernetes.io/projected/1f23292d-4f7c-4850-bd3d-895a85ec5392-kube-api-access-zbtx4\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.817074 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f23292d-4f7c-4850-bd3d-895a85ec5392-host\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.817143 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f23292d-4f7c-4850-bd3d-895a85ec5392-serviceca\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.825474 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.868662 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.903822 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.917650 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f23292d-4f7c-4850-bd3d-895a85ec5392-host\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.917791 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f23292d-4f7c-4850-bd3d-895a85ec5392-serviceca\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.917814 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbtx4\" (UniqueName: \"kubernetes.io/projected/1f23292d-4f7c-4850-bd3d-895a85ec5392-kube-api-access-zbtx4\") pod \"node-ca-nwq8l\" 
(UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.917746 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f23292d-4f7c-4850-bd3d-895a85ec5392-host\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.919220 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f23292d-4f7c-4850-bd3d-895a85ec5392-serviceca\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.943971 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:04Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.948076 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:04 crc kubenswrapper[4658]: E1002 11:19:04.948227 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:04 crc kubenswrapper[4658]: I1002 11:19:04.974463 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbtx4\" (UniqueName: \"kubernetes.io/projected/1f23292d-4f7c-4850-bd3d-895a85ec5392-kube-api-access-zbtx4\") pod \"node-ca-nwq8l\" (UID: \"1f23292d-4f7c-4850-bd3d-895a85ec5392\") " pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.017894 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.048466 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.077265 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwq8l" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.086682 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: W1002 11:19:05.099075 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f23292d_4f7c_4850_bd3d_895a85ec5392.slice/crio-2cabc34503943db907fbc9952db221350d84f79669e1549e474c720adce4c107 WatchSource:0}: Error finding container 2cabc34503943db907fbc9952db221350d84f79669e1549e474c720adce4c107: Status 404 returned error can't find the container with id 
2cabc34503943db907fbc9952db221350d84f79669e1549e474c720adce4c107 Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.130877 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.146704 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwq8l" event={"ID":"1f23292d-4f7c-4850-bd3d-895a85ec5392","Type":"ContainerStarted","Data":"2cabc34503943db907fbc9952db221350d84f79669e1549e474c720adce4c107"} Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.148185 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerStarted","Data":"d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e"} Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.164186 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.206781 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.254283 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.308053 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.323243 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.363589 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.401225 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.443812 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.482680 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.529236 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.567801 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.604722 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.649862 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.682414 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.729494 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98a
c366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.766733 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.802438 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.844546 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.883546 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.926798 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.948881 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.948974 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:05 crc kubenswrapper[4658]: E1002 11:19:05.949029 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:05 crc kubenswrapper[4658]: E1002 11:19:05.949698 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:05 crc kubenswrapper[4658]: I1002 11:19:05.962998 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:05Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.006630 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.043453 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.115910 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.130150 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.152200 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwq8l" event={"ID":"1f23292d-4f7c-4850-bd3d-895a85ec5392","Type":"ContainerStarted","Data":"36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261"} Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.154156 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e" exitCode=0 Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.154214 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e"} Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.174233 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.206111 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.247381 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.284926 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.324804 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.366893 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.406583 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.444289 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.489524 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.532576 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.577988 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.608282 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.646813 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.689933 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.732908 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.769432 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.806695 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.835637 4658 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.841563 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.841600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.841610 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.841694 4658 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.847741 4658 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.847965 4658 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.848903 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.848929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.848939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.848955 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.848968 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.861386 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.865586 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.865622 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.865631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.865645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.865654 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.886970 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.890714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.890751 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
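
Every one of these rejected patches fails for the same reason: the kubelet cannot verify the TLS serving certificate of the network-node-identity webhook at https://127.0.0.1:9743, whose validity window ended 2025-08-24T17:21:41Z, well before the node's current time. A minimal sketch for confirming the certificate window from the node itself (the endpoint and port are taken from the log lines above; the third-party cryptography package is assumed to be available):

    import ssl
    from datetime import datetime

    from cryptography import x509  # third-party; assumed installed on the host

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the entries above

    # Fetch the serving certificate WITHOUT verifying it -- verification is
    # exactly the step that fails in the log.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # the log implies 2025-08-24 17:21:41
    print("expired:  ", datetime.utcnow() > cert.not_valid_after)

Per the timestamps recorded above, this would report the certificate as more than five weeks past its notAfter date.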
event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.890760 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.890774 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.890784 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.902519 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.905675 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.905706 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
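
The patch bodies elided above are ordinary JSON strategic-merge patches (note the $setElementOrder directives), just doubly escaped by the journal's quoting. A rough sketch for recovering one from a saved journal entry, standard library only; the marker strings mirror the entries above, and each journal entry is assumed to be a single line in the saved file:

    import codecs
    import json

    def extract_status_patch(journal_line: str) -> dict:
        """Pull the escaped JSON patch out of a 'failed to patch status' entry."""
        start_marker = 'failed to patch status \\"'
        start = journal_line.index(start_marker) + len(start_marker)
        end = journal_line.index('\\" for ', start)
        span = journal_line[start:end]
        # Undo two levels of quoting: \\\" -> \" -> "
        span = codecs.decode(span, "unicode_escape")
        span = codecs.decode(span, "unicode_escape")
        return json.loads(span)

    # e.g. patch["status"]["conditions"] holds the four conditions being set,
    # and patch["status"]["images"] the node image list shown in full above.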
event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.905714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.905728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.905738 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.919014 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.923336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.923375 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
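
Independently of the webhook failure, the node keeps reporting NotReady because the kubelet finds no CNI configuration. A quick sketch for checking the directory named in the condition message (run on the node; the path is taken verbatim from the log):

    from pathlib import Path

    cni_dir = Path("/etc/kubernetes/cni/net.d")  # directory named in the NotReady message

    # kubelet looks for *.conf / *.conflist files here; the cluster's network
    # plugin is expected to write one once it starts.
    configs = sorted(p.name for p in cni_dir.glob("*.conf*")) if cni_dir.is_dir() else []
    print(f"{cni_dir}: {configs or 'no CNI configuration files found'}")

Until such a file appears, every status sync will record the same KubeletNotReady condition seen in these entries.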
event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.923383 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.923398 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.923406 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.934689 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:06Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.934803 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.936464 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.936497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.936507 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.936520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.936533 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:06Z","lastTransitionTime":"2025-10-02T11:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:06 crc kubenswrapper[4658]: I1002 11:19:06.949038 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:06 crc kubenswrapper[4658]: E1002 11:19:06.949182 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.038450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.038493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.038506 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.038521 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.038532 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.140815 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.140866 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.140877 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.140893 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.140904 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.160790 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc" exitCode=0 Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.160874 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.168844 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.175547 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.190896 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.219058 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.235128 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.246806 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.246833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.246841 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.246853 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.246863 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.252908 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.270719 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168
.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.285426 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-contr
oller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.303161 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.312650 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.324978 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.336400 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.356722 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.356766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.356774 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc 
kubenswrapper[4658]: I1002 11:19:07.356789 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.356800 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.359704 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b
394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.372994 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.410082 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.445869 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.459098 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.459150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.459164 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.459184 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.459196 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.563980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.564033 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.564042 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.564060 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.564070 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.667878 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.667953 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.667966 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.667992 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.668007 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.770825 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.770869 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.770886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.770909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.770922 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.873714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.873808 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.873827 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.873860 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.873885 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.948609 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.948706 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:07 crc kubenswrapper[4658]: E1002 11:19:07.948825 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:07 crc kubenswrapper[4658]: E1002 11:19:07.948961 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.978382 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.978437 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.978450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.978472 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:07 crc kubenswrapper[4658]: I1002 11:19:07.978486 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:07Z","lastTransitionTime":"2025-10-02T11:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.081766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.081824 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.081841 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.081866 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.081882 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.176983 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96" exitCode=0 Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.177054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.184082 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.184165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.184191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.184225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.184250 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.205837 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.221141 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.237994 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff8
2088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.251183 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.264927 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.279465 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.286656 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.286695 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.286707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.286727 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.286745 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.293830 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.316921 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.353614 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.366934 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.379020 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.389264 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.389471 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.389554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.389646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.389719 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.392072 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.404134 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.419674 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.432057 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.470186 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.470825 4658 scope.go:117] "RemoveContainer" containerID="f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3" Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.470963 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.492922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.492951 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.492960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.492972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.492981 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.568733 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.568943 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.568983 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.569026 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.569046 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569202 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569222 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569236 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569311 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:16.569274159 +0000 UTC m=+37.460427726 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569388 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:16.569381962 +0000 UTC m=+37.460535529 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569436 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569451 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569460 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569481 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:16.569474955 +0000 UTC m=+37.460628512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569519 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569540 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:16.569534657 +0000 UTC m=+37.460688224 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569584 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.569605 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:16.569599619 +0000 UTC m=+37.460753186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.595765 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.595831 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.595849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.595875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.595895 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.698700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.698762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.698780 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.698804 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.698821 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.802271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.802818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.802878 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.802956 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.803020 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.907880 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.907917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.907927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.907942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.907951 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:08Z","lastTransitionTime":"2025-10-02T11:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:08 crc kubenswrapper[4658]: I1002 11:19:08.948589 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:08 crc kubenswrapper[4658]: E1002 11:19:08.948771 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.009839 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.009885 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.009903 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.009928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.009946 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.112621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.112687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.112698 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.112719 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.112731 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.190097 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.190429 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.196201 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerStarted","Data":"27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.215799 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.215855 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.215871 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.215892 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.215908 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.218077 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.237689 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.256980 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff8
2088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.275276 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.279163 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.305878 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3
e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.319041 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.319095 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.319108 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.319128 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.319142 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.324423 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.342535 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.364455 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.387145 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.402730 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.422578 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.422996 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.423177 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.423436 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.423580 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.425908 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.442108 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.454254 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.469527 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.490585 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.503543 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.517221 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.526603 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.526636 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.526646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.526663 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc 
kubenswrapper[4658]: I1002 11:19:09.526674 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.532316 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":
\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.542848 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.564946 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a5
9afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.578276 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.591915 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.605991 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.622889 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff8
2088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.629124 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.629169 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.629181 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.629197 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.629539 
4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.639641 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.654065 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.669444 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.683935 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.698652 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.730345 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3
e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.733369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.733439 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.733463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.733493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.733515 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.836925 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.836993 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.837006 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.837026 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.837037 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.939998 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.940070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.940092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.940121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.940142 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:09Z","lastTransitionTime":"2025-10-02T11:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.948380 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.948442 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:09 crc kubenswrapper[4658]: E1002 11:19:09.948620 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:09 crc kubenswrapper[4658]: E1002 11:19:09.948816 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.966166 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:09 crc kubenswrapper[4658]: I1002 11:19:09.983018 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.004105 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.020638 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.036146 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.042511 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.042559 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.042569 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.042589 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.042604 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.058440 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.074878 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.098155 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.111705 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.130216 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.145660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.145717 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.145729 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.145751 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.145765 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.150330 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.173929 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052
939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.186794 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.204696 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d" exitCode=0 Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.204786 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.204900 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.205358 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.206798 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.227234 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.242738 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.248859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.248888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.248898 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.248912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.248924 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.257812 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.275036 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.291142 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.313473 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.338190 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.352194 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.356135 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.356186 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.356200 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.356225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.356238 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.411659 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.435555 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.456748 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.457877 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.457915 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.457926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.457945 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.457960 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.472061 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.488075 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.501340 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.515443 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.528086 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.544583 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b
19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.559079 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.560925 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.560970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.560981 4658 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.561001 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.561018 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.573989 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.589053 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.604835 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.648573 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.663897 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.663939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.663948 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.663968 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.663979 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.685793 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.728618 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.768107 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.768184 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.768207 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.768240 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.768263 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.777039 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.812600 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.850782 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.876172 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.876365 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.876392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.876452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.876466 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.885952 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.928938 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.948703 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:10 crc kubenswrapper[4658]: E1002 11:19:10.948901 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.965897 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.981260 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.981340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.981364 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.981387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:10 crc kubenswrapper[4658]: I1002 11:19:10.981401 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:10Z","lastTransitionTime":"2025-10-02T11:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.011157 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.054747 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.083801 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.083829 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.083837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.083851 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.083861 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.186766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.186805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.186814 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.186828 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.186837 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.211661 4658 generic.go:334] "Generic (PLEG): container finished" podID="894543ca-6e44-42e8-b41b-4578646d527f" containerID="079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554" exitCode=0 Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.211746 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerDied","Data":"079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.211818 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.239872 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.256348 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.268603 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.285086 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.289064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.289089 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.289097 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.289112 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.289123 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.295161 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.312169 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.325351 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.364484 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.391594 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.391642 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.391653 4658 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.391670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.391680 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.405527 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.447374 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.485931 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.494467 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.494523 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.494534 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.494558 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.494570 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.547431 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.565812 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.599228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.599258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.599270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.599287 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.599313 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.611378 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.659406 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:11Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.701373 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.701417 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.701430 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.701446 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.701458 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.804253 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.804304 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.804315 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.804330 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.804339 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.907167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.907215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.907227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.907244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.907259 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:11Z","lastTransitionTime":"2025-10-02T11:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.949149 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:11 crc kubenswrapper[4658]: I1002 11:19:11.949176 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:11 crc kubenswrapper[4658]: E1002 11:19:11.949409 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:11 crc kubenswrapper[4658]: E1002 11:19:11.949672 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.010281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.010349 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.010359 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.010378 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.010390 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.113655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.113686 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.113694 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.113708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.113723 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.220182 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.220223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.220235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.220251 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.220263 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.221933 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/0.log" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.225492 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0" exitCode=1 Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.225649 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.226777 4658 scope.go:117] "RemoveContainer" containerID="9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.229906 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" event={"ID":"894543ca-6e44-42e8-b41b-4578646d527f","Type":"ContainerStarted","Data":"3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.248828 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.266497 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.284784 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.303038 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.314721 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.321968 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.322090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.322166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.322225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.322284 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.330936 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.347694 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.361425 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.382879 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3
e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.394709 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.407477 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.418707 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.424818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.424844 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.424855 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.424884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.424895 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.430865 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.438817 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.454753 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.465154 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.475803 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.489527 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.505176 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.527223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.527268 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.527280 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.527321 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.527335 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.528601 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.546997 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.556873 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.572838 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.602785 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.635374 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.635420 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.635431 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.635448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.635459 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.649134 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.691513 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.731394 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.738026 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.738067 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.738079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.738106 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.738119 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.766995 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.812699 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.840958 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.841000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.841016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.841035 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.841047 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.855450 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:12Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.944324 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.944387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.944404 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.944434 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.944452 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:12Z","lastTransitionTime":"2025-10-02T11:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:12 crc kubenswrapper[4658]: I1002 11:19:12.948531 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:12 crc kubenswrapper[4658]: E1002 11:19:12.948647 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.046881 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.046922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.046931 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.046945 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.046955 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.150326 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.150376 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.150385 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.150399 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.150408 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.235327 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/0.log" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.238352 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.238482 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.251810 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067f
cf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.252938 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.252979 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.252993 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.253015 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.253028 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.264094 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.278770 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.293040 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.314175 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba3
39aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.328202 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.339867 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.352460 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.355478 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.355515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.355528 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.355544 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.355555 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.363412 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.382928 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052
939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.402019 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.425063 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.440594 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.458346 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.458382 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.458392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.458408 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.458420 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.459113 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.475089 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.561436 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.561494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.561506 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.561531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.561543 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.664393 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.664474 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.664502 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.664536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.664561 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.767256 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.767357 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.767379 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.767405 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.767423 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.871021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.871095 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.871119 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.871147 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.871165 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.949050 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.949154 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:13 crc kubenswrapper[4658]: E1002 11:19:13.949271 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:13 crc kubenswrapper[4658]: E1002 11:19:13.949487 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.974000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.974040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.974049 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.974064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:13 crc kubenswrapper[4658]: I1002 11:19:13.974075 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:13Z","lastTransitionTime":"2025-10-02T11:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.076270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.076322 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.076336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.076352 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.076366 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.179653 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.179713 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.179733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.179758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.179776 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.241650 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.283017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.283069 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.283105 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.283144 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.283168 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.385923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.385974 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.385991 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.386014 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.386032 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.489458 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.489535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.489557 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.489584 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.489604 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.592120 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.592165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.592179 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.592197 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.592210 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.694751 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.694821 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.694840 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.694866 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.694887 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.797768 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.797835 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.797849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.797874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.797890 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.858186 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx"] Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.858986 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.861821 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.861965 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.881850 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.900643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.900692 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.900711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.900737 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.900755 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:14Z","lastTransitionTime":"2025-10-02T11:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.904663 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.925065 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.942354 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.947864 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrs78\" (UniqueName: \"kubernetes.io/projected/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-kube-api-access-nrs78\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.947963 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.948017 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.948120 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.948120 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:14 crc kubenswrapper[4658]: E1002 11:19:14.948279 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.959942 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.979551 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:14 crc kubenswrapper[4658]: I1002 11:19:14.995945 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.002674 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.002721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.002733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc 
kubenswrapper[4658]: I1002 11:19:15.002754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.002768 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.027218 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b
394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.044915 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.049148 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.049207 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.049277 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.049385 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nrs78\" (UniqueName: \"kubernetes.io/projected/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-kube-api-access-nrs78\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.049951 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.050775 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.062166 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.066257 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.077113 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrs78\" (UniqueName: \"kubernetes.io/projected/8f01b099-f45d-4f2e-8e0d-e2e8b36d9384-kube-api-access-nrs78\") pod \"ovnkube-control-plane-749d76644c-2bqqx\" (UID: \"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.084494 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.104607 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.105866 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.105911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.105929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.105950 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.105965 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.119672 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.134055 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.147667 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.162975 4658 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c
857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:15Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.182396 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" Oct 02 11:19:15 crc kubenswrapper[4658]: W1002 11:19:15.205060 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f01b099_f45d_4f2e_8e0d_e2e8b36d9384.slice/crio-0389f88f1366588e38eaf117b323fc0df2cd9ecba305bd548b1b48547080e906 WatchSource:0}: Error finding container 0389f88f1366588e38eaf117b323fc0df2cd9ecba305bd548b1b48547080e906: Status 404 returned error can't find the container with id 0389f88f1366588e38eaf117b323fc0df2cd9ecba305bd548b1b48547080e906 Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.209559 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.209608 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.209627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.209652 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.209668 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.247578 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" event={"ID":"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384","Type":"ContainerStarted","Data":"0389f88f1366588e38eaf117b323fc0df2cd9ecba305bd548b1b48547080e906"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.312457 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.312515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.312530 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.312551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.312569 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.415857 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.415897 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.415910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.415928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.416033 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.518747 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.518801 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.518813 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.518834 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.518849 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.621740 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.621964 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.621973 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.621987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.621997 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.725140 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.725181 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.725192 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.725209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.725220 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.828228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.828272 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.828289 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.828354 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.828374 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.931374 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.931428 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.931440 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.931458 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.931471 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:15Z","lastTransitionTime":"2025-10-02T11:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.948996 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:15 crc kubenswrapper[4658]: I1002 11:19:15.949173 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:15 crc kubenswrapper[4658]: E1002 11:19:15.949464 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:15 crc kubenswrapper[4658]: E1002 11:19:15.949604 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.034336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.034440 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.034462 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.034535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.034553 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.136972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.137029 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.137046 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.137070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.137086 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.239132 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.239168 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.239178 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.239191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.239200 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.254267 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" event={"ID":"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384","Type":"ContainerStarted","Data":"f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.254332 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" event={"ID":"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384","Type":"ContainerStarted","Data":"9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.269957 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"ima
ge\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.292100 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.317588 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.338234 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.341869 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.341917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.341931 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.341945 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.341955 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.370735 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.384336 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6fxls"] Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.384882 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.384957 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.395450 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b39
4e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.407246 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.421927 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.439625 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.444685 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.444743 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.444763 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.444790 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.444813 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.450571 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.463627 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq5b\" (UniqueName: \"kubernetes.io/projected/2ea83baf-570c-46db-ad98-aa9ec89d1c82-kube-api-access-8xq5b\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.463689 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.465597 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.479744 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.491396 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.509033 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.524658 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.548214 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.548264 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.548277 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.548319 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.548333 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.549640 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.564673 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.564864 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq5b\" (UniqueName: \"kubernetes.io/projected/2ea83baf-570c-46db-ad98-aa9ec89d1c82-kube-api-access-8xq5b\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.564885 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.564998 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:17.06496932 +0000 UTC m=+37.956122917 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.569477 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.590272 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f4
69658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.592570 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq5b\" (UniqueName: \"kubernetes.io/projected/2ea83baf-570c-46db-ad98-aa9ec89d1c82-kube-api-access-8xq5b\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.612346 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.628399 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.645832 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.651167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.651232 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.651252 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.651281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.651329 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.665498 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.665623 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.665668 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.665770 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.665805 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.665767869 +0000 UTC m=+53.556921476 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.665850 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.665834272 +0000 UTC m=+53.556987879 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.665946 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666006 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666035 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.666026 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.666094 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666160 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.66612115 +0000 UTC m=+53.557274887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666223 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666313 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.666274245 +0000 UTC m=+53.557428022 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666330 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666354 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666368 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.666440 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.66642054 +0000 UTC m=+53.557574097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.667235 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.684605 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.702182 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.720267 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.736524 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.753922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.753971 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.753989 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.754021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.754037 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.759379 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.781513 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.794153 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.805487 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.819727 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.830454 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.851612 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.857052 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.857088 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.857098 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.857117 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.857131 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.948873 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:16 crc kubenswrapper[4658]: E1002 11:19:16.949080 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.959940 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.959992 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.960004 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.960022 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:16 crc kubenswrapper[4658]: I1002 11:19:16.960036 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:16Z","lastTransitionTime":"2025-10-02T11:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.063396 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.063464 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.063483 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.063508 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.063528 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.070616 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.070791 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.070872 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:18.070853364 +0000 UTC m=+38.962006941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.166876 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.166963 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.166987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.167017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.167041 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.197448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.197508 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.197520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.197545 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.197561 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.228330 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.234619 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.234684 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.234699 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.234721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.234738 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.254627 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.260660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.260694 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.260705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.260722 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.260739 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.283957 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.289670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.289934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.290071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.290203 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.290374 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.310264 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.315847 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.315912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.315928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.315949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.315966 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.333526 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.333856 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.336068 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.336159 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.336190 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.336224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.336249 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.440520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.441208 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.441465 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.441670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.441819 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.544410 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.544480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.544501 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.544525 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.544545 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.647902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.648626 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.648772 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.648905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.649030 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.752031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.752400 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.752552 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.752698 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.752879 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.856637 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.856698 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.856721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.856749 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.856769 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.948478 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.948715 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.948477 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.949227 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.949410 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:17 crc kubenswrapper[4658]: E1002 11:19:17.949621 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.958915 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.958983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.959044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.959072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:17 crc kubenswrapper[4658]: I1002 11:19:17.959091 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:17Z","lastTransitionTime":"2025-10-02T11:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.062572 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.062660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.062680 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.062726 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.062758 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.082424 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:18 crc kubenswrapper[4658]: E1002 11:19:18.082634 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:18 crc kubenswrapper[4658]: E1002 11:19:18.082708 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:20.082687605 +0000 UTC m=+40.973841172 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.166157 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.166201 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.166209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.166223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.166233 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.168788 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.169043 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.196695 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" probeResult="failure" output="" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.212939 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" probeResult="failure" output="" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.269580 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.269663 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.269686 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.269711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.269735 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.372963 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.373035 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.373052 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.373073 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.373088 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.477066 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.477120 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.477138 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.477157 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.477170 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.580367 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.580407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.580416 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.580435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.580445 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.683983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.684160 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.684185 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.684221 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.684365 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.787247 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.787402 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.787425 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.787448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.787464 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.890521 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.890573 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.890584 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.890602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.890615 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.948845 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:18 crc kubenswrapper[4658]: E1002 11:19:18.949038 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.994143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.994193 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.994211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.994235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:18 crc kubenswrapper[4658]: I1002 11:19:18.994253 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:18Z","lastTransitionTime":"2025-10-02T11:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.096859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.096916 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.096933 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.096955 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.096971 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.199991 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.200041 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.200056 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.200074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.200087 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.303154 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.303677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.303825 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.303915 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.303983 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.408123 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.408188 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.408211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.408239 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.408257 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.510938 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.510997 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.511014 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.511037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.511057 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.613915 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.613981 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.613999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.614023 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.614041 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.716886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.716970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.717012 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.717044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.717065 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.820827 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.820890 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.820914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.820942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.820968 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.924400 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.924493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.924519 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.924549 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.924575 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:19Z","lastTransitionTime":"2025-10-02T11:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.949239 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.949358 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:19 crc kubenswrapper[4658]: E1002 11:19:19.949518 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.949250 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:19 crc kubenswrapper[4658]: E1002 11:19:19.949735 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:19 crc kubenswrapper[4658]: E1002 11:19:19.950083 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.980174 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\
\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:19 crc kubenswrapper[4658]: I1002 11:19:19.996401 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.019963 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.028146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.028216 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.028250 4658 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.028285 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.028341 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.035874 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.056578 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.075814 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.091411 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.117418 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:20 crc kubenswrapper[4658]: E1002 11:19:20.117749 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:20 crc kubenswrapper[4658]: E1002 11:19:20.117889 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:24.117863792 +0000 UTC m=+45.009017359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.120344 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.130437 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.130489 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.130502 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.130523 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.130541 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.138699 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.154455 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.172225 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.195980 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.210848 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.234600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.234654 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.234676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc 
kubenswrapper[4658]: I1002 11:19:20.234700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.234717 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.245833 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b
394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.270256 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.293167 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.313828 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.337941 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.338016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.338035 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.338061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.338080 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.441937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.441992 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.442013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.442037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.442057 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.544905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.544971 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.544989 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.545013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.545029 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.647901 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.647950 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.647959 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.647976 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.647986 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.750685 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.750728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.750739 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.750757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.750768 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.853662 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.853694 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.853702 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.853715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.853725 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.948179 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:20 crc kubenswrapper[4658]: E1002 11:19:20.948663 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.948950 4658 scope.go:117] "RemoveContainer" containerID="f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.955867 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.955920 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.956146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.956163 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:20 crc kubenswrapper[4658]: I1002 11:19:20.956178 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:20Z","lastTransitionTime":"2025-10-02T11:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.059024 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.059063 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.059072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.059086 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.059094 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.161702 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.161754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.161767 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.161787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.161801 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.264758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.264831 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.264849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.264875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.264918 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.274751 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.276930 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.277479 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.298184 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.316577 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.335859 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.350555 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.367695 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.367745 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.367762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.367785 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.367801 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.371894 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.386473 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.405045 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.423078 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.440075 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.458452 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.470679 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.470737 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.470754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.470778 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.470795 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.474654 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.499052 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.518329 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.532285 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.548090 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.563235 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.574048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.574095 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.574114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc 
kubenswrapper[4658]: I1002 11:19:21.574137 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.574155 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.584609 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b
394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:21Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.677126 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.677170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.677186 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.677209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.677248 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.780037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.780109 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.780129 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.780156 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.780174 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.882211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.882336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.882362 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.882385 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.882401 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.948411 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.948420 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:21 crc kubenswrapper[4658]: E1002 11:19:21.948690 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.948442 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:21 crc kubenswrapper[4658]: E1002 11:19:21.948782 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:21 crc kubenswrapper[4658]: E1002 11:19:21.948891 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.985705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.985756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.985787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.985803 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:21 crc kubenswrapper[4658]: I1002 11:19:21.985813 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:21Z","lastTransitionTime":"2025-10-02T11:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.088865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.088916 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.088927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.088945 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.088957 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.191223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.191266 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.191277 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.191313 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.191326 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.293597 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.293657 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.293677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.293701 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.293720 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.396492 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.396549 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.396570 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.396595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.396614 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.499653 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.499693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.499705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.499723 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.499736 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.601912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.601987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.602009 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.602063 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.602082 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.704617 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.704762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.704784 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.704808 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.704824 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.807623 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.807693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.807715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.807741 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.807759 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.910712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.910756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.910766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.910783 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.910795 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:22Z","lastTransitionTime":"2025-10-02T11:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:22 crc kubenswrapper[4658]: I1002 11:19:22.948320 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:22 crc kubenswrapper[4658]: E1002 11:19:22.948458 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.013775 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.013833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.013845 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.013864 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.013876 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.116483 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.116520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.116533 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.116548 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.116560 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.219180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.219249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.219267 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.219318 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.219336 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.321807 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.321977 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.322013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.322043 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.322064 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.425167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.425227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.425268 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.425324 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.425349 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.528951 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.529033 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.529053 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.529078 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.529096 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.632162 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.632209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.632233 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.632259 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.632275 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.736143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.736194 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.736205 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.736221 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.736231 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.839631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.839688 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.839706 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.839728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.839745 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.943065 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.943114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.943132 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.943161 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.943175 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:23Z","lastTransitionTime":"2025-10-02T11:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.948556 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.948659 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:23 crc kubenswrapper[4658]: E1002 11:19:23.948708 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:23 crc kubenswrapper[4658]: I1002 11:19:23.948567 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:23 crc kubenswrapper[4658]: E1002 11:19:23.948971 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:23 crc kubenswrapper[4658]: E1002 11:19:23.949076 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.045938 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.045978 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.045987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.046028 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.046040 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.148608 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.148682 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.148705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.148732 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.148771 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.180251 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:24 crc kubenswrapper[4658]: E1002 11:19:24.180501 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:24 crc kubenswrapper[4658]: E1002 11:19:24.180620 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:32.180586118 +0000 UTC m=+53.071739715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.251879 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.251927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.251948 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.251977 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.251999 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.354682 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.354735 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.354752 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.354774 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.354792 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.457903 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.457967 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.457989 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.458017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.458037 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.561175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.561247 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.561271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.561342 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.561368 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.664781 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.664828 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.664842 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.664859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.664871 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.767813 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.767850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.767861 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.767878 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.767887 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.870649 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.870707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.870725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.870749 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.870765 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.948440 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:24 crc kubenswrapper[4658]: E1002 11:19:24.948635 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.973380 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.973452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.973477 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.973509 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:24 crc kubenswrapper[4658]: I1002 11:19:24.973531 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:24Z","lastTransitionTime":"2025-10-02T11:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.076576 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.076637 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.076654 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.076676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.076693 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.181713 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.181760 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.181777 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.181800 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.181817 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.284922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.284985 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.285003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.285027 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.285042 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.388118 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.388175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.388191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.388215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.388232 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.491322 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.491367 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.491377 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.491391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.491400 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.593919 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.593976 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.593991 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.594010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.594024 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.696384 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.696426 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.696435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.696449 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.696459 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.799175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.799253 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.799277 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.799334 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.799355 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.902038 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.902108 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.902125 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.902151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.902167 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:25Z","lastTransitionTime":"2025-10-02T11:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.948628 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.948745 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:25 crc kubenswrapper[4658]: E1002 11:19:25.948846 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:25 crc kubenswrapper[4658]: I1002 11:19:25.948915 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:25 crc kubenswrapper[4658]: E1002 11:19:25.949116 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:25 crc kubenswrapper[4658]: E1002 11:19:25.949324 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.005609 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.005666 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.005683 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.005708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.005728 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.109170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.109229 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.109249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.109274 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.109324 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.212451 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.212522 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.212533 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.212552 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.212564 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.315020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.315066 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.315077 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.315092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.315102 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.418532 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.418600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.418618 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.418645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.418665 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.520439 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.520524 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.520558 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.520598 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.520620 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.623771 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.623810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.623819 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.623832 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.623841 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.726766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.726818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.726835 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.726858 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.726875 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.832420 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.832474 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.832510 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.832531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.832544 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.935244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.935341 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.935360 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.935384 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.935403 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:26Z","lastTransitionTime":"2025-10-02T11:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:26 crc kubenswrapper[4658]: I1002 11:19:26.948680 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:26 crc kubenswrapper[4658]: E1002 11:19:26.948945 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.038809 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.038848 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.038860 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.038876 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.038888 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.142681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.142760 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.142779 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.142808 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.142825 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.244829 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.244876 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.244972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.245018 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.245027 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.348147 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.348224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.348245 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.348270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.348326 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.451874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.451935 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.451947 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.451963 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.451975 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.522636 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.522684 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.522698 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.522714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.522727 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.542054 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.547006 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.547076 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.547096 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.547123 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.547141 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.563654 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.567530 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.567591 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
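Every failed patch attempt above traces back to one cause: the webhook endpoint at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, weeks before the node's current clock time of 2025-10-02. A minimal Go sketch like the following (a hypothetical diagnostic, assuming it is run on the node itself, since the endpoint is loopback-only) can confirm what the kubelet's TLS client is rejecting by dialing with verification disabled and printing the certificate's validity window:

```go
// certcheck.go - minimal sketch: print the validity window of the
// certificate served by the webhook endpoint named in the log above.
// The address 127.0.0.1:9743 is taken from the log, not from any API.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify lets us inspect an expired certificate instead
	// of failing the handshake the way the kubelet's client does.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}
```

On this node the output should show notAfter=2025-08-24T17:21:41Z, matching the x509 error in the records above and below.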
event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.567616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.567645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.567666 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.588619 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.592767 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.592832 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.592845 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.592863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.592887 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.610980 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.615760 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.615813 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.615829 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.615849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.615864 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.631645 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.631962 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.633918 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
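The transition from "will retry" to "Unable to update node status" reflects a fixed retry budget: the kubelet attempts each node-status update a small constant number of times (nodeStatusUpdateRetry, five in the upstream kubelet sources this log appears to come from) before giving up until the next sync interval, which is why exactly five E-level "will retry" records precede the record above. The following is an illustrative sketch of that bounded-retry shape, not the kubelet's actual code; the error strings are copied from the log:

```go
// retry.go - illustrative sketch of the bounded retry loop implied by
// the five "Error updating node status, will retry" records followed by
// "update node status exceeds retry count". Not kubelet source.
package main

import (
	"errors"
	"fmt"
)

// Mirrors the upstream kubelet constant (assumption: 5 in recent releases).
const nodeStatusUpdateRetry = 5

func updateNodeStatus(patch func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patch(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	// Every attempt fails the same way here, as in the log: the webhook's
	// serving certificate is expired, so the patch can never succeed.
	webhookDown := errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
	if err := updateNodeStatus(func() error { return webhookDown }); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```

Because the failure is deterministic (an expired certificate), retrying within one cycle cannot help; the loop exists to ride out transient apiserver or network blips.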
event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.633967 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.633985 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.634011 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.634069 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.736791 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.736846 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.736867 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.736890 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.736906 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.839873 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.839941 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.839958 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.839981 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.839998 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.942754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.942802 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.942815 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.942832 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.942844 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:27Z","lastTransitionTime":"2025-10-02T11:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.949006 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.949110 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.949419 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.949500 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:27 crc kubenswrapper[4658]: I1002 11:19:27.949578 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:27 crc kubenswrapper[4658]: E1002 11:19:27.949712 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.046039 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.046117 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.046130 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.046175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.046189 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.148103 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.148172 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.148190 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.148216 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.148233 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.251560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.251621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.251639 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.251663 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.251680 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.353909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.354006 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.354059 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.354084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.354100 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.457646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.457725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.457747 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.457777 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.457802 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.560637 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.560712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.560736 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.560764 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.560785 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.663667 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.663747 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.663770 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.663796 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.663818 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.766743 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.766786 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.766799 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.766817 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.766829 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.870534 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.870708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.870736 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.870766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.870786 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.948925 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:28 crc kubenswrapper[4658]: E1002 11:19:28.949136 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.979966 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.980079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.980138 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.980169 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:28 crc kubenswrapper[4658]: I1002 11:19:28.980225 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:28Z","lastTransitionTime":"2025-10-02T11:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.083116 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.083173 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.083194 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.083223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.083246 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.185401 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.185442 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.185451 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.185466 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.185478 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.288851 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.288892 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.288902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.288917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.288928 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.391870 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.391912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.391923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.391939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.391950 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.494882 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.494939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.494949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.494965 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.494974 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.597321 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.597368 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.597377 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.597391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.597401 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.700121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.700211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.700232 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.700265 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.700290 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.803595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.803672 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.803686 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.803704 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.803717 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.906581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.906633 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.906643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.906660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.906670 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:29Z","lastTransitionTime":"2025-10-02T11:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.948929 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:29 crc kubenswrapper[4658]: E1002 11:19:29.949091 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.949382 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:29 crc kubenswrapper[4658]: E1002 11:19:29.949588 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.949686 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:29 crc kubenswrapper[4658]: E1002 11:19:29.949792 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.966770 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.981897 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:29 crc kubenswrapper[4658]: I1002 11:19:29.999266 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.012778 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 
11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.012849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.012865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.012886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.012907 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.016853 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.032906 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.052242 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.069538 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.084534 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.103207 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba3
39aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.115061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.115117 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.115138 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.115161 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.115176 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.120924 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.130540 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.138568 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.150183 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.162981 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.175484 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.189050 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.205853 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:30Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.218118 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.218162 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.218180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.218203 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.218221 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.321010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.321087 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.321102 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.321122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.321135 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.423834 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.423878 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.423889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.423907 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.423918 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
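The two "Failed to update status for pod" entries above share a root cause: every status patch is routed through the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24 while the node clock reads 2025-10-02. A minimal sketch of the same validity-window check, assuming Python with the third-party cryptography package and a placeholder certificate path (the real path is not shown in the log):

```python
# Minimal sketch: compare a PEM certificate's validity window against the
# current time, mirroring the x509 failure in the log ("certificate has
# expired or is not yet valid"). CERT_PATH is a placeholder, not from the log.
from datetime import datetime, timezone

from cryptography import x509  # pip install cryptography (>= 42 for *_utc)

CERT_PATH = "/path/to/webhook-serving-cert.pem"  # hypothetical location

def check_validity(path: str) -> None:
    with open(path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    now = datetime.now(timezone.utc)
    if now < cert.not_valid_before_utc:
        print(f"not yet valid: {now} is before {cert.not_valid_before_utc}")
    elif now > cert.not_valid_after_utc:
        print(f"expired: {now} is after {cert.not_valid_after_utc}")  # the log's case
    else:
        print(f"valid until {cert.not_valid_after_utc}")

if __name__ == "__main__":
    check_validity(CERT_PATH)
```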
Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.526205 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.526261 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.526276 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.526326 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.526343 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.629668 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.629723 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.629737 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.629756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.629771 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.733202 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.733270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.733327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.733360 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.733386 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.836230 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.836284 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.836333 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.836362 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.836387 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.939460 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.939535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.939576 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.939610 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.939634 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:30Z","lastTransitionTime":"2025-10-02T11:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:30 crc kubenswrapper[4658]: I1002 11:19:30.949122 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:30 crc kubenswrapper[4658]: E1002 11:19:30.949362 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.042587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.042651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.042664 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.042683 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.042695 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:31Z","lastTransitionTime":"2025-10-02T11:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.146039 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.146093 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.146103 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.146133 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.146146 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:31Z","lastTransitionTime":"2025-10-02T11:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.869337 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.869403 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.869429 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.869461 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.869484 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:31Z","lastTransitionTime":"2025-10-02T11:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.949101 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.949152 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.949196 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:31 crc kubenswrapper[4658]: E1002 11:19:31.949268 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:31 crc kubenswrapper[4658]: E1002 11:19:31.949693 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:31 crc kubenswrapper[4658]: E1002 11:19:31.949764 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.972159 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.972208 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.972224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.972243 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:31 crc kubenswrapper[4658]: I1002 11:19:31.972257 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:31Z","lastTransitionTime":"2025-10-02T11:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.074536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.074572 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.074584 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.074599 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.074610 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.178973 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.179017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.179030 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.179048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.179061 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.275517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.275590 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.275660 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:19:48.275642235 +0000 UTC m=+69.166795802 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.281360 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.281392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.281403 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.281421 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.281432 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.384888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.384959 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.384984 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.385013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.385033 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
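The mount failure above also shows the retry policy: the operation is parked with "No retries permitted until ... (durationBeforeRetry 16s)", and the related operations that fail shortly afterwards are parked for 32s. That doubling is consistent with a capped exponential backoff per volume operation; a minimal illustrative sketch (the base and cap are assumptions, not values read from kubelet source):

```python
# Minimal sketch: capped doubling backoff of the shape suggested by the
# log's durationBeforeRetry values (16s, then 32s on the next failure).
# The base and cap below are illustrative assumptions.
def backoff_schedule(base: float = 0.5, cap: float = 128.0, tries: int = 8):
    delay = base
    for attempt in range(1, tries + 1):
        yield attempt, delay
        delay = min(delay * 2.0, cap)

if __name__ == "__main__":
    for attempt, delay in backoff_schedule():
        print(f"failure {attempt}: next retry in {delay:g}s")
```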
Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.488381 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.488449 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.488472 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.488503 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.488528 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.591771 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.591841 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.591858 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.591884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.591903 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.680032 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.680173 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.680213 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.680255 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.680337 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680426 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:20:04.68037996 +0000 UTC m=+85.571533587 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680455 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680505 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680529 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:20:04.680507833 +0000 UTC m=+85.571661430 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680634 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680667 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:20:04.680635347 +0000 UTC m=+85.571788954 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680670 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680520 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680780 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680697 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680823 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680944 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:20:04.680913366 +0000 UTC m=+85.572067073 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.680991 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:20:04.680971328 +0000 UTC m=+85.572124965 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.694366 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.694436 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.694461 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.694493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.694515 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.798004 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.798052 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.798062 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.798079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.798097 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
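The cluster of mount errors above has one shape: object "namespace"/"name" not registered, meaning the restarted kubelet has not yet resynced those Secret and ConfigMap objects into its local store, so every volume built from them fails and is parked for retry. A minimal triage sketch that extracts the distinct unregistered objects from journal text (the regex is fitted to the message format seen here):

```python
# Minimal sketch: collect distinct 'object "ns"/"name" not registered'
# references from kubelet journal text, listing the Secrets/ConfigMaps the
# kubelet is still waiting to sync.
# Usage: journalctl -u kubelet | python3 triage.py
import re
import sys

PATTERN = re.compile(r'object "([^"]+)"/"([^"]+)" not registered')

def unregistered_objects(text: str) -> set[tuple[str, str]]:
    return set(PATTERN.findall(text))

if __name__ == "__main__":
    for ns, name in sorted(unregistered_objects(sys.stdin.read())):
        print(f"{ns}/{name}")
```

Over this excerpt it would list, among others, openshift-multus/metrics-daemon-secret, openshift-network-console/networking-console-plugin, openshift-network-console/networking-console-plugin-cert, and the openshift-network-diagnostics kube-root-ca.crt and openshift-service-ca.crt bundles.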
Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.900934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.901001 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.901021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.901047 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.901068 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:32Z","lastTransitionTime":"2025-10-02T11:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:32 crc kubenswrapper[4658]: I1002 11:19:32.948934 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:32 crc kubenswrapper[4658]: E1002 11:19:32.949121 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.004142 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.004211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.004231 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.004257 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.004291 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:33Z","lastTransitionTime":"2025-10-02T11:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.948516 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.948555 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:33 crc kubenswrapper[4658]: I1002 11:19:33.948598 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:33 crc kubenswrapper[4658]: E1002 11:19:33.948618 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:33 crc kubenswrapper[4658]: E1002 11:19:33.948786 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:33 crc kubenswrapper[4658]: E1002 11:19:33.948887 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.039930 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.039980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.040041 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.040062 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.040081 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.143102 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.143140 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.143151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.143168 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.143179 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.247066 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.247150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.247176 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.247202 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.247225 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.328188 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/1.log" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.329228 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/0.log" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.334104 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2" exitCode=1 Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.334193 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.334500 4658 scope.go:117] "RemoveContainer" containerID="9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.335818 4658 scope.go:117] "RemoveContainer" containerID="e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2" Oct 02 11:19:34 crc kubenswrapper[4658]: E1002 11:19:34.336179 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.349504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.349548 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.349561 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.349577 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.349591 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.363242 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.383871 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.399756 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.417223 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.440957 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba3
39aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" 
logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"i
nitContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.452063 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.452140 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.452160 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.452183 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.452197 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.465873 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.483596 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.496913 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.515939 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.530052 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.546013 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.555623 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.555693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.555713 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.555741 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.555759 4658 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.562969 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.576608 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.592984 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.608201 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.628503 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router
-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.646090 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.658788 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.658853 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.658877 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.658908 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.658933 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.761973 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.762039 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.762061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.762090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.762111 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.859231 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.865259 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.865328 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.865345 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.865364 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.865379 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.874437 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.882445 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.901492 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.918156 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.931329 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.946666 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.948827 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:34 crc kubenswrapper[4658]: E1002 11:19:34.948988 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.961863 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.968496 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.968531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.968541 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.968555 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.968565 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:34Z","lastTransitionTime":"2025-10-02T11:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.980221 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:34 crc kubenswrapper[4658]: I1002 11:19:34.994412 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:34Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.007520 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.020061 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.035635 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.062554 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba3
39aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" 
logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"i
nitContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.070558 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.070590 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.070599 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.070612 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.070621 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.077285 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.090079 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.105331 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.117187 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.141108 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.173398 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.173442 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.173451 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.173465 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.173477 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.276058 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.276513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.276667 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.276865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.277061 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.341146 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/1.log" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.380713 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.380778 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.380794 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.380821 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.380838 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.484142 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.484202 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.484220 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.484243 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.484260 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.587798 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.587884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.587909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.587959 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.587986 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.691447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.691520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.691537 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.691559 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.691576 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.795016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.795100 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.795127 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.795164 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.795192 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.899501 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.899559 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.899573 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.899593 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.899608 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:35Z","lastTransitionTime":"2025-10-02T11:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.948455 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.948476 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:35 crc kubenswrapper[4658]: I1002 11:19:35.948700 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:35 crc kubenswrapper[4658]: E1002 11:19:35.948734 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:35 crc kubenswrapper[4658]: E1002 11:19:35.948988 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:35 crc kubenswrapper[4658]: E1002 11:19:35.949176 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.002993 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.003424 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.003839 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.004209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.004394 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.108144 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.109094 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.109319 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.109532 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.109750 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.213569 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.213632 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.213649 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.213674 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.213691 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.316224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.316575 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.316754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.316949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.317089 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.421228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.421347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.421373 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.421403 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.421428 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.524551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.524687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.524713 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.524774 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.524797 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.628040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.628104 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.628122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.628146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.628163 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.735943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.736008 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.736027 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.736112 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.736141 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.780873 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.805842 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.826718 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.839901 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.840000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.840023 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.840090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.840109 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.843078 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.863454 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 
2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.877594 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.898409 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc 
kubenswrapper[4658]: I1002 11:19:36.914959 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.940368 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 
seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.943752 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.943797 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.943811 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.943830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.943843 4658 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:36Z","lastTransitionTime":"2025-10-02T11:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.948845 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:36 crc kubenswrapper[4658]: E1002 11:19:36.949018 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.962441 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.977884 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:36 crc kubenswrapper[4658]: I1002 11:19:36.992082 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.008074 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.021062 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.045153 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.047199 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.047414 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.047439 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.047516 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.047536 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.061276 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.074247 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.086709 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.097582 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.150031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.150096 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.150105 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.150121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.150131 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.252579 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.252635 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.252645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.252659 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.252668 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.356167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.356224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.356241 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.356263 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.356280 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.459641 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.459704 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.459720 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.459743 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.459762 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.563399 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.563472 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.563496 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.563525 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.563546 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.666584 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.666666 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.666684 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.666712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.666732 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.770616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.770675 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.770691 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.770714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.770744 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.822995 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.823044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.823057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.823074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.823087 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.843855 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.848875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.848937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.848958 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.848988 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.849011 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.868829 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.879803 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.879863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.879888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.879917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.879940 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.902374 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.907682 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.907755 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.907780 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.907812 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.907836 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.928133 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.932279 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.932393 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.932419 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.932453 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.932476 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.948542 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.948684 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.948717 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.948917 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.949258 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.949548 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.953491 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 11:19:37 crc kubenswrapper[4658]: E1002 11:19:37.953745 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.956550 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
event="NodeHasSufficientMemory" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.956591 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.956602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.956621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:37 crc kubenswrapper[4658]: I1002 11:19:37.956634 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:37Z","lastTransitionTime":"2025-10-02T11:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.059850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.059911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.059928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.059952 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.059972 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.163390 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.163453 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.163469 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.163494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.163511 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.266856 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.266952 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.266975 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.267000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.267060 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.371100 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.371159 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.371180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.371207 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.371227 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.474363 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.474447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.474469 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.474506 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.474530 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.578416 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.578463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.578471 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.578489 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.578498 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.681185 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.681323 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.681340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.681357 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.681370 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.784231 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.784322 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.784342 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.784364 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.784381 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.887882 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.887960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.887983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.888013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.888035 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.948602 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:38 crc kubenswrapper[4658]: E1002 11:19:38.948928 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.991057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.991114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.991124 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.991141 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:38 crc kubenswrapper[4658]: I1002 11:19:38.991152 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:38Z","lastTransitionTime":"2025-10-02T11:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.094002 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.094075 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.094091 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.094115 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.094134 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.197711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.197822 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.197855 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.197885 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.197907 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.301538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.301663 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.301677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.301697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.301708 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.405831 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.405968 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.405987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.406029 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.406088 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.509619 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.509715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.509735 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.509760 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.509780 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.613133 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.613196 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.613211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.613239 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.613258 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.716709 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.716778 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.716800 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.716828 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.716847 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.819686 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.819822 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.819843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.819868 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.819886 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.924468 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.924536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.924553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.924575 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.924594 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:39Z","lastTransitionTime":"2025-10-02T11:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.948136 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.948180 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.948190 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:39 crc kubenswrapper[4658]: E1002 11:19:39.948408 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:39 crc kubenswrapper[4658]: E1002 11:19:39.948677 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:39 crc kubenswrapper[4658]: E1002 11:19:39.948844 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.963762 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:39 crc kubenswrapper[4658]: I1002 11:19:39.981121 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.001463 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.019140 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.028550 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.028620 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc 
kubenswrapper[4658]: I1002 11:19:40.028641 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.028675 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.028698 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.033792 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.055466 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.072722 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.099690 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.114931 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.128861 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.132801 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.132849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.132861 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.132880 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.132893 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.156188 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3cea5238a37152715f49f6517cf3a8f73d2fe3e7aabdc0aa7230587bdbf7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"message\\\":\\\" (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 11:19:11.736835 5903 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737060 5903 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737117 5903 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:19:11.737356 5903 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:19:11.737930 5903 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:19:11.738007 5903 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:19:11.738039 5903 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:19:11.738065 5903 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:19:11.738077 5903 factory.go:656] Stopping watch factory\\\\nI1002 11:19:11.738104 5903 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:19:11.738081 5903 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.180109 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.198526 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.212542 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.230137 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.235150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.235194 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.235211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.235228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.235239 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.243068 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.274504 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052
939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.285611 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.338395 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.338452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.338466 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.338484 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.338496 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.441285 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.441348 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.441359 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.441375 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.441384 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.543715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.543757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.543767 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.543781 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.543790 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.646426 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.646480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.646490 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.646505 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.646517 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.749722 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.749782 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.749799 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.749823 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.749843 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.852475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.852522 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.852538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.852563 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.852580 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.948462 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:40 crc kubenswrapper[4658]: E1002 11:19:40.948576 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.954654 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.954725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.954743 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.954768 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:40 crc kubenswrapper[4658]: I1002 11:19:40.954786 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:40Z","lastTransitionTime":"2025-10-02T11:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.056525 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.056560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.056568 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.056581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.056591 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.159110 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.159174 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.159190 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.159213 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.159229 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.262869 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.263008 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.263048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.263079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.263102 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.365825 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.365865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.365875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.365908 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.365921 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.468850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.468901 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.468918 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.468939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.468955 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.571730 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.571778 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.571795 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.571818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.571835 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.673799 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.673846 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.673858 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.673875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.673886 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.776900 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.776964 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.776980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.777004 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.777020 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.881061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.881124 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.881142 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.881166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.881182 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.948993 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.948993 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:41 crc kubenswrapper[4658]: E1002 11:19:41.949229 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.949399 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:19:41 crc kubenswrapper[4658]: E1002 11:19:41.949644 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:19:41 crc kubenswrapper[4658]: E1002 11:19:41.949830 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.984434 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.984487 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.984499 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.984515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:41 crc kubenswrapper[4658]: I1002 11:19:41.984527 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:41Z","lastTransitionTime":"2025-10-02T11:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.087471 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.087550 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.087571 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.087600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.087625 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.190575 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.190708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.190762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.190816 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.190851 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.293943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.293995 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.294007 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.294025 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.294040 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.397468 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.397547 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.397570 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.397602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.397624 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.500994 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.501061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.501078 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.501101 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.501120 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.604821 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.604943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.604965 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.604990 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.605008 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.709204 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.709256 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.709273 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.709344 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.709368 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.811581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.811658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.811681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.811711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.811733 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.914512 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.914575 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.914599 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.914627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.914648 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:42Z","lastTransitionTime":"2025-10-02T11:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:42 crc kubenswrapper[4658]: I1002 11:19:42.948532 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:19:42 crc kubenswrapper[4658]: E1002 11:19:42.948731 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.016932 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.017006 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.017030 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.017059 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.017077 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.120517 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.120578 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.120595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.120618 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.120634 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.223916 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.223978 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.223995 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.224019 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.224036 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.326711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.326762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.326780 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.326840 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.326857 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.429621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.429677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.429692 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.429710 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.429721 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.532686 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.532725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.532735 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.532750 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.532761 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.635695 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.635765 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.635787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.635818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.635847 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.737859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.737905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.737914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.737929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.737940 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.840703 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.840748 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.840757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.840771 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.840786 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.943601 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.943644 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.943652 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.943665 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.943675 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:43Z","lastTransitionTime":"2025-10-02T11:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.948863 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:19:43 crc kubenswrapper[4658]: E1002 11:19:43.948992 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.948870 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:19:43 crc kubenswrapper[4658]: I1002 11:19:43.949100 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:19:43 crc kubenswrapper[4658]: E1002 11:19:43.949149 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:19:43 crc kubenswrapper[4658]: E1002 11:19:43.949360 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.046518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.046554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.046563 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.046576 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.046584 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.149238 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.149336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.149403 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.149435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.149456 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.252831 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.252863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.252872 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.252886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.252895 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.355767 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.355840 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.355850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.355863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.355872 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.458676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.458733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.458747 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.458766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.458776 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.561328 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.561370 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.561380 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.561397 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.561411 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.663250 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.663288 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.663309 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.663324 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.663333 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.765921 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.765984 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.765997 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.766016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.766029 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.868664 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.868718 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.868732 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.868757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.868771 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.948469 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:19:44 crc kubenswrapper[4658]: E1002 11:19:44.948626 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.971042 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.971119 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.971146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.971177 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:44 crc kubenswrapper[4658]: I1002 11:19:44.971198 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:44Z","lastTransitionTime":"2025-10-02T11:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.074182 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.074224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.074235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.074252 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.074262 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.176932 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.176986 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.176999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.177018 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.177031 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.279616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.279681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.279693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.279707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.279716 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.381232 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.381327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.381336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.381352 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.381361 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.483270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.483326 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.483339 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.483355 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.483367 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.586079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.586128 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.586144 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.586160 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.586174 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.688404 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.688457 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.688468 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.688500 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.688510 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.790531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.790581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.790600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.790618 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.790634 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.893406 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.893469 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.893486 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.893510 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.893526 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.948146 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.948193 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.948247 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:19:45 crc kubenswrapper[4658]: E1002 11:19:45.948308 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:19:45 crc kubenswrapper[4658]: E1002 11:19:45.948723 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:19:45 crc kubenswrapper[4658]: E1002 11:19:45.948858 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.995796 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.995861 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.995875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.995894 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:45 crc kubenswrapper[4658]: I1002 11:19:45.995907 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:45Z","lastTransitionTime":"2025-10-02T11:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.098227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.098274 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.098283 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.098314 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.098324 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.200835 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.200868 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.200879 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.200893 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.200903 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.303031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.303086 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.303097 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.303116 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.303140 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.405537 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.405579 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.405588 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.405602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.405612 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.507925 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.507990 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.508003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.508018 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.508028 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.612933 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.612970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.612981 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.612996 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.613008 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.716361 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.716395 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.716407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.716423 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.716434 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.818613 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.818656 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.818665 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.818677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.818686 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.921541 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.921600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.921611 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.921627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.921638 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:46Z","lastTransitionTime":"2025-10-02T11:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:46 crc kubenswrapper[4658]: I1002 11:19:46.949108 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:19:46 crc kubenswrapper[4658]: E1002 11:19:46.949334 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.024608 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.024653 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.024663 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.024679 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.024689 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.127076 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.127121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.127131 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.127146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.127157 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.230885 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.230937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.230950 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.230969 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.230981 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.333850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.333939 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.333960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.334189 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.334208 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.437131 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.437196 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.437209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.437228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.437241 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.539482 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.539527 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.539538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.539555 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.539568 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.641637 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.641680 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.641688 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.641702 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.641711 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.744406 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.744451 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.744459 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.744474 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.744483 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.846758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.846808 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.846819 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.846835 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.846847 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.948205 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.948240 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:47 crc kubenswrapper[4658]: E1002 11:19:47.948373 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.948415 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:47 crc kubenswrapper[4658]: E1002 11:19:47.948565 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:47 crc kubenswrapper[4658]: E1002 11:19:47.948629 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949032 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949107 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949141 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949185 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:47Z","lastTransitionTime":"2025-10-02T11:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.949221 4658 scope.go:117] "RemoveContainer" containerID="e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.960124 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:47Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.969549 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:47Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.983995 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:47Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:47 crc kubenswrapper[4658]: I1002 11:19:47.995976 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:47Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.018897 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.030937 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.041707 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.051595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.051820 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.051921 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.052025 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.052161 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.055751 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.068355 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.084755 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.098591 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.113540 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.130941 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.146773 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.155010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.155045 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.155054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.155068 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.155077 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.160258 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.161480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.161524 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.161536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.161551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.161574 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.173618 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.174322 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.177446 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.177475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.177484 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.177498 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.177507 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.188209 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191107 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191160 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191174 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191184 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.191032 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba3
39aad654380611e2576a2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.202427 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.203845 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.206093 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.206130 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.206143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.206161 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.206174 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.217259 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.220623 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.220651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.220659 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.220673 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.220683 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.231481 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.232603 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.256802 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.256845 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.256854 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.256870 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.256881 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.359603 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.359631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.359640 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.359652 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.359662 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.375536 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.375724 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.375820 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:20:20.375797001 +0000 UTC m=+101.266950638 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.393065 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/1.log" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.396118 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.396607 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.410583 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"w
ebhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.429944 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.445001 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a
490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.456195 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.461580 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.461824 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.461920 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.461995 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.462060 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.477460 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.488041 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.500769 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.561552 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.564758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.564784 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.564792 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.564804 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.564815 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.573799 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.581805 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.603214 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.617418 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.629769 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.642612 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.666865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.666907 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.666920 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.666937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.666949 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.670692 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.682767 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.693240 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.702472 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.770258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.770317 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.770331 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.770347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.770359 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.873420 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.873468 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.873480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.873496 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.873508 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.949119 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:48 crc kubenswrapper[4658]: E1002 11:19:48.949280 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.976153 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.976235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.976269 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.976340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:48 crc kubenswrapper[4658]: I1002 11:19:48.976374 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:48Z","lastTransitionTime":"2025-10-02T11:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.078409 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.078438 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.078448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.078463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.078474 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.181226 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.181270 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.181278 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.181311 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.181322 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.284070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.284110 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.284119 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.284132 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.284142 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.386148 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.386221 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.386245 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.386273 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.386342 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.402700 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/2.log" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.403896 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/1.log" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.407638 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" exitCode=1 Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.407827 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.408008 4658 scope.go:117] "RemoveContainer" containerID="e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.408836 4658 scope.go:117] "RemoveContainer" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" Oct 02 11:19:49 crc kubenswrapper[4658]: E1002 11:19:49.409094 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.425011 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.435833 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.457925 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.476667 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.488675 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.488724 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.488740 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 
11:19:49.488758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.488770 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.491916 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.507793 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.529511 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.541980 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.554558 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.566178 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.576829 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.591230 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.591269 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.591278 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.591307 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.591318 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.594584 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.605414 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.616063 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.635067 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.649105 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.662599 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.673133 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.693828 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.693857 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.693865 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.693879 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.693887 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.797121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.797186 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.797204 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.797229 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.797248 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.899931 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.899985 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.899993 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.900008 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.900018 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:49Z","lastTransitionTime":"2025-10-02T11:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.948977 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.949007 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:49 crc kubenswrapper[4658]: E1002 11:19:49.949158 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.949182 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:49 crc kubenswrapper[4658]: E1002 11:19:49.949279 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:49 crc kubenswrapper[4658]: E1002 11:19:49.949378 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.963894 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\
\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.980157 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:49 crc kubenswrapper[4658]: I1002 11:19:49.995550 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o
://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:49Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.002447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.002582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.002601 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.002627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.002645 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.006159 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.017275 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.029877 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.041720 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.055203 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.073574 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1bd7855a4b2315e99517323eff3126d49c10ba339aad654380611e2576a2ef2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:33Z\\\",\\\"message\\\":\\\"encies-checksum:sha256:f0beb9378fd30968608b370e3877d4c76f7539f11a5eebf44bee42a8b2dd7068 openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207264 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz: failed to check if pod openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1002 11:19:33.207288 6084 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz: failed to check if pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz is in primary UDN: could not find OVN pod annotation in map[]\\\\nE1002 11:19:33.292342 6084 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1002 11:19:33.293573 6084 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:19:33.293637 6084 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.092628 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.104992 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.105033 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.105045 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.105061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.105072 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.107611 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.118958 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.127360 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.139948 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.149481 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.160760 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.172781 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.182395 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.207462 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.207496 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.207507 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.207522 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.207535 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.309396 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.309452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.309467 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.309488 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.309506 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.411690 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.411728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.411739 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.411756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.411767 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.412564 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/2.log" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.415457 4658 scope.go:117] "RemoveContainer" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" Oct 02 11:19:50 crc kubenswrapper[4658]: E1002 11:19:50.415584 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.416994 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/0.log" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.417019 4658 generic.go:334] "Generic (PLEG): container finished" podID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5" containerID="fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385" exitCode=1 Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.417038 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerDied","Data":"fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.417246 4658 scope.go:117] "RemoveContainer" containerID="fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.438561 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.452759 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.470175 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.485348 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.495678 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.512561 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.514972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.514995 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.515004 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.515016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.515025 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.527883 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.542184 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.557051 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.570226 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.583966 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.601013 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.619837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.619970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.619991 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.620019 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.620073 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.624089 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.639365 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.655935 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.672919 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.695428 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.706884 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.718288 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.722131 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.722168 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.722180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.722198 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.722211 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.734544 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.748659 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.761625 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.782546 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.809154 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a
2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824111 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824354 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824378 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824388 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824402 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.824410 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.835864 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.846641 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.862511 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.873364 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.888499 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.900325 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.910936 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.921833 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.926555 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.926594 4658 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.926604 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.926618 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.926627 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:50Z","lastTransitionTime":"2025-10-02T11:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.932719 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.943592 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.948446 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:50 crc kubenswrapper[4658]: E1002 11:19:50.948687 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.951687 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:50 crc kubenswrapper[4658]: I1002 11:19:50.956174 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.029812 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.029868 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.029886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.029912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.029930 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.132953 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.133014 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.133036 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.133079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.133134 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.235949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.235986 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.235997 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.236013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.236025 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.339506 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.339571 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.339582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.339600 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.339612 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.421814 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/0.log" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.422027 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerStarted","Data":"96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.433156 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.442596 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.442664 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.442677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.442728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.442743 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.446848 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.457464 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c2
8fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.471491 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc
22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.484042 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.494721 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.521442 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.531145 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217
cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.544714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.544902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.544961 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.545089 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.545151 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.546500 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.566563 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.580867 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.598395 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.607709 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.623597 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.633239 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.643459 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.647283 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.647328 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.647340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.647354 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.647364 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.662894 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.675737 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.685281 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:51Z is after 2025-08-24T17:21:41Z" Oct 02 
11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.749789 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.750010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.750073 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.750135 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.750290 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.853561 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.853602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.853612 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.853627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.853637 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.948880 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.948880 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:51 crc kubenswrapper[4658]: E1002 11:19:51.949015 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:51 crc kubenswrapper[4658]: E1002 11:19:51.949269 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.949532 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:51 crc kubenswrapper[4658]: E1002 11:19:51.949812 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.955250 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.955281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.955303 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.955316 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:51 crc kubenswrapper[4658]: I1002 11:19:51.955327 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:51Z","lastTransitionTime":"2025-10-02T11:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.058091 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.058151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.058165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.058184 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.058198 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.161177 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.161259 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.161285 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.161347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.161371 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.263957 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.264020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.264040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.264064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.264081 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.367810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.367849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.367857 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.367870 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.367879 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.470150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.470189 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.470197 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.470209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.470218 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.573195 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.573289 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.573374 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.573409 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.573433 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.676426 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.676478 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.676494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.676513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.676526 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.779538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.779597 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.779613 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.779631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.779642 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.882462 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.882504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.882516 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.882531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.882542 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.948622 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:52 crc kubenswrapper[4658]: E1002 11:19:52.948864 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.984225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.984256 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.984265 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.984278 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:52 crc kubenswrapper[4658]: I1002 11:19:52.984286 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:52Z","lastTransitionTime":"2025-10-02T11:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.087273 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.087343 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.087358 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.087378 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.087392 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.190800 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.190885 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.190909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.190941 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.190964 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.293815 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.293871 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.293888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.293914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.293934 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.397268 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.397419 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.397447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.397476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.397496 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.500121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.500163 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.500178 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.500196 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.500207 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.602582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.602640 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.602651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.602670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.602684 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.706094 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.706140 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.706152 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.706170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.706183 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.809193 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.809254 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.809273 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.809332 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.809357 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.911942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.912013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.912032 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.912057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.912075 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:53Z","lastTransitionTime":"2025-10-02T11:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.948169 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.948169 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:53 crc kubenswrapper[4658]: E1002 11:19:53.948401 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:53 crc kubenswrapper[4658]: I1002 11:19:53.948417 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:53 crc kubenswrapper[4658]: E1002 11:19:53.948554 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:53 crc kubenswrapper[4658]: E1002 11:19:53.948672 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.015201 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.015258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.015275 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.015318 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.015332 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.117926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.117969 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.117978 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.118021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.118040 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.221154 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.221216 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.221235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.221258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.221273 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.324399 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.324449 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.324494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.324526 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.324549 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.427473 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.427532 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.427551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.427575 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.427592 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.530458 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.530512 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.530528 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.530558 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.530574 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.633991 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.634079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.634100 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.634126 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.634177 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.736493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.736529 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.736538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.736551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.736560 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.839482 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.839612 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.839630 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.839655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.839676 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.942143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.942197 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.942212 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.942260 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.942277 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:54Z","lastTransitionTime":"2025-10-02T11:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:54 crc kubenswrapper[4658]: I1002 11:19:54.948887 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:54 crc kubenswrapper[4658]: E1002 11:19:54.949061 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.045432 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.045503 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.045521 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.045546 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.045562 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.148433 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.148506 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.148528 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.148553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.148574 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.252020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.252096 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.252119 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.252149 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.252172 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.355025 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.355077 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.355090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.355108 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.355122 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.458562 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.458631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.458650 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.458673 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.458689 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.561706 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.561770 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.561782 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.561806 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.561821 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.665518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.665574 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.665590 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.665609 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.665621 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.768735 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.768776 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.768787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.768802 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.768812 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.871410 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.871447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.871465 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.871481 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.871492 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.948271 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:55 crc kubenswrapper[4658]: E1002 11:19:55.948499 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.948607 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.948739 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:55 crc kubenswrapper[4658]: E1002 11:19:55.948895 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:55 crc kubenswrapper[4658]: E1002 11:19:55.949079 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.974435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.974500 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.974560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.974593 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:55 crc kubenswrapper[4658]: I1002 11:19:55.974614 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:55Z","lastTransitionTime":"2025-10-02T11:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.077592 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.077648 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.077667 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.077688 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.077706 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.181336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.181426 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.181452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.181485 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.181520 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.283848 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.283892 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.283902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.283914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.283923 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.386418 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.386469 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.386478 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.386493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.386503 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.489693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.489750 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.489759 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.489772 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.489782 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.592557 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.592606 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.592614 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.592630 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.592641 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.696347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.696387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.696400 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.696418 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.696431 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.798404 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.798452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.798464 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.798480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.798494 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.900954 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.901030 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.901055 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.901084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.901108 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:56Z","lastTransitionTime":"2025-10-02T11:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:56 crc kubenswrapper[4658]: I1002 11:19:56.948362 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:56 crc kubenswrapper[4658]: E1002 11:19:56.948513 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.003958 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.004007 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.004020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.004035 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.004048 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.106969 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.107016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.107028 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.107044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.107055 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.209706 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.209777 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.209805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.209834 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.209856 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.313287 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.313361 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.313373 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.313392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.313407 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.415811 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.415874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.415889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.415911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.415926 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.518621 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.518715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.518739 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.518771 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.518793 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.622232 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.622288 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.622346 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.622369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.622384 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.724839 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.724889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.724904 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.724924 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.724938 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.827598 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.827638 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.827647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.827661 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.827670 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.930450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.930518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.930535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.930565 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.930582 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:57Z","lastTransitionTime":"2025-10-02T11:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.948848 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.948864 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:57 crc kubenswrapper[4658]: E1002 11:19:57.948969 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:57 crc kubenswrapper[4658]: I1002 11:19:57.948987 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:57 crc kubenswrapper[4658]: E1002 11:19:57.949072 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:57 crc kubenswrapper[4658]: E1002 11:19:57.949217 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.033911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.033952 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.033961 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.033975 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.033987 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.136685 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.136745 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.136757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.136777 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.136790 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.240007 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.240056 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.240065 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.240080 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.240093 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.342614 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.342666 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.342683 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.342707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.342724 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.376981 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.377021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.377030 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.377043 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.377052 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.398144 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:58Z is after 2025-08-24T17:21:41Z"
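
[annotation] This entry is the first hard failure beneath all the retries above: the kubelet's status PATCH for node "crc" is rejected not by the API server's own validation but by the node.network-node-identity.openshift.io admission webhook, and the webhook call dies in the TLS handshake because the certificate served at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, well before the current time 2025-10-02T11:19:58Z. A short sketch for inspecting the certificate actually being served follows; it is a diagnostic illustration, not OpenShift tooling. Verification is disabled deliberately so the expired certificate can still be fetched, and the openssl CLI is assumed to be available for printing the validity dates.

#!/usr/bin/env python3
# Diagnostic sketch: fetch the certificate served by the webhook endpoint
# named in the error above and print its notBefore/notAfter window.
# Host and port come from the log; everything else is illustrative.
import socket
import ssl
import subprocess

HOST, PORT = "127.0.0.1", 9743   # endpoint from the failed webhook Post

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # accept the expired cert so it can be read

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock) as tls:
        der = tls.getpeercert(binary_form=True)

# openssl prints notBefore=/notAfter=; notAfter should match the
# 2025-08-24T17:21:41Z expiry quoted in the log entry above.
pem = ssl.DER_cert_to_PEM_cert(der)
print(subprocess.run(["openssl", "x509", "-noout", "-dates"],
                     input=pem.encode(), capture_output=True,
                     check=True).stdout.decode())

The immediate retry at 11:19:58.419124 below sends the identical patch and fails the same way, so the node is wedged on two counts: the CNI configuration is still missing, and the status heartbeat itself cannot land until this webhook certificate is rotated.
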
event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.402681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.402697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.402709 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.419124 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:58Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.423614 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.423647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.423658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.423676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.423689 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.443682 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:58Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.449042 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.449096 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.449113 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.449140 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.449158 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.469222 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:58Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.474421 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.474475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.474611 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.474639 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.474657 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.491485 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:58Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.491720 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.494113 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.494175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.494195 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.494222 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.494242 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.597789 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.597874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.597899 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.597930 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.597950 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.700643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.700677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.700687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.700699 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.700707 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.802410 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.802473 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.802497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.802525 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.802545 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.905561 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.905644 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.905671 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.905700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.905720 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:58Z","lastTransitionTime":"2025-10-02T11:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:58 crc kubenswrapper[4658]: I1002 11:19:58.948168 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:19:58 crc kubenswrapper[4658]: E1002 11:19:58.948427 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.008655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.008709 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.008724 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.008746 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.008761 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.111016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.111045 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.111053 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.111067 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.111078 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.213445 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.213504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.213513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.213527 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.213538 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.316016 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.316073 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.316086 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.316103 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.316115 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.417889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.417937 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.417949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.417966 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.417978 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.520281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.520340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.520351 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.520369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.520380 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.623211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.623271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.623289 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.623344 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.623362 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.725990 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.726038 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.726053 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.726072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.726083 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.828006 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.828043 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.828054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.828067 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.828077 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.930718 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.930776 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.930793 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.930818 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.930835 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:19:59Z","lastTransitionTime":"2025-10-02T11:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.948340 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.948485 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:19:59 crc kubenswrapper[4658]: E1002 11:19:59.948565 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.948668 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:19:59 crc kubenswrapper[4658]: E1002 11:19:59.948772 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:19:59 crc kubenswrapper[4658]: E1002 11:19:59.948882 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.967055 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:59Z is after 2025-08-24T17:21:41Z" Oct 02 11:19:59 crc kubenswrapper[4658]: I1002 11:19:59.986786 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:59Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.007230 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.024011 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.033033 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.033072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc 
kubenswrapper[4658]: I1002 11:20:00.033081 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.033111 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.033120 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.035370 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.047108 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.058447 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.069947 4658 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.079414 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.090628 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.108162 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e79
40f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.118407 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217
cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.132122 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.135924 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.135980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.135996 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.136018 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.136030 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.145873 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.155208 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.167631 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.177931 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.194552 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.204551 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.239431 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.239494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.239510 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 
11:20:00.239532 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.239551 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.342850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.342917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.342943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.342975 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.342998 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.447157 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.447224 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.447248 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.447280 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.447335 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.550277 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.550480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.550514 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.550602 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.550701 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.653499 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.653560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.653581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.653606 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.653624 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.756742 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.757354 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.757384 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.757421 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.757444 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.860837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.860877 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.860889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.860905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.860917 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.948112 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:00 crc kubenswrapper[4658]: E1002 11:20:00.948253 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.964193 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.964244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.964255 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.964274 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:00 crc kubenswrapper[4658]: I1002 11:20:00.964286 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:00Z","lastTransitionTime":"2025-10-02T11:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.067124 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.067193 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.067216 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.067246 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.067270 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.170083 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.170117 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.170130 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.170144 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.170154 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.273785 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.273857 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.273883 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.273911 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.273933 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.376472 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.376538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.376550 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.376567 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.376580 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.479511 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.479592 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.479616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.479647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.479669 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.583084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.583145 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.583185 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.583214 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.583234 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.686566 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.686627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.686649 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.686681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.686705 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.794075 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.794155 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.794180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.794215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.794252 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.897888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.897979 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.897999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.898022 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.898040 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:01Z","lastTransitionTime":"2025-10-02T11:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.949052 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.949208 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:01 crc kubenswrapper[4658]: I1002 11:20:01.949447 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:01 crc kubenswrapper[4658]: E1002 11:20:01.949570 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:01 crc kubenswrapper[4658]: E1002 11:20:01.949617 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:20:01 crc kubenswrapper[4658]: E1002 11:20:01.949689 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.001121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.001178 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.001195 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.001217 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.001235 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
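The three "Error syncing pod" records above share one root cause, repeated in every node heartbeat: the network plugin (OVN-Kubernetes on this node) has not yet written a config file into /etc/kubernetes/cni/net.d/. A minimal diagnostic sketch in Go of the kind of presence check the message implies, not the actual kubelet or CRI-O source; only the directory path comes from the log, and the *.conf/*.conflist/*.json extensions are an assumption:

    // cnicheck.go - hypothetical sketch: report whether any CNI config file
    // exists in the directory named by the kubelet's NetworkPluginNotReady error.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d" // directory named in the log
        var found []string
        for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, _ := filepath.Glob(filepath.Join(confDir, pattern))
            found = append(found, matches...)
        }
        if len(found) == 0 {
            // This is the state the node is stuck in above: the network
            // provider has not written its CNI configuration yet.
            fmt.Fprintln(os.Stderr, "no CNI configuration file in", confDir)
            os.Exit(1)
        }
        fmt.Println("CNI config present:", found)
    }

Once the network provider writes its config, the NetworkReady=false condition above should clear on a subsequent runtime status sync.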
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.104189 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.104253 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.104268 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.104357 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.104376 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.207527 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.207607 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.207630 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.207660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.207677 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.311263 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.311369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.311394 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.311425 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.311444 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.413787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.413836 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.413847 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.413863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.413894 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.517514 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.517577 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.517596 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.517622 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.517639 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.620959 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.621017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.621036 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.621064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.621082 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.723587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.723661 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.723681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.723705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.723724 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.826074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.826123 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.826134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.826152 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.826164 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.928585 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.928635 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.928652 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.928676 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.928691 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:02Z","lastTransitionTime":"2025-10-02T11:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:02 crc kubenswrapper[4658]: I1002 11:20:02.949112 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:02 crc kubenswrapper[4658]: E1002 11:20:02.949268 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.030980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.031045 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.031057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.031071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.031085 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.134407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.134476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.134500 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.134528 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.134554 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.238083 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.238131 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.238148 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.238170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.238187 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.340315 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.340458 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.340477 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.340497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.340513 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.442905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.442956 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.442970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.442986 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.442996 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.545681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.545728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.545738 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.545755 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.545765 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.648607 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.648734 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.648800 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.648830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.648853 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.752044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.752100 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.752113 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.752136 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.752151 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.855181 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.855249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.855263 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.855290 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.855342 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.948969 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.949025 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.949055 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:03 crc kubenswrapper[4658]: E1002 11:20:03.949196 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:03 crc kubenswrapper[4658]: E1002 11:20:03.949364 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:03 crc kubenswrapper[4658]: E1002 11:20:03.949484 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.958347 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.958414 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.958436 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.958461 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:03 crc kubenswrapper[4658]: I1002 11:20:03.958479 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:03Z","lastTransitionTime":"2025-10-02T11:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.061715 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.061769 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.061786 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.061810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.061827 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.164357 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.164440 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.164464 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.164498 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.164520 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.267783 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.267833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.267852 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.267874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.267892 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.370863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.370935 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.370957 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.370983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.371003 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.473439 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.473495 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.473516 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.473549 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.473572 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.575942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.576007 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.576020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.576040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.576053 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.679147 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.679223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.679247 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.679279 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.679331 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.761909 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.762010 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.762056 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762164 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:08.762134041 +0000 UTC m=+149.653287648 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.762220 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762237 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.762267 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762354 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:21:08.762328377 +0000 UTC m=+149.653482014 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762412 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762473 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762394 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762588 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762612 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762510 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762705 4658 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762520 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:21:08.762495493 +0000 UTC m=+149.653649100 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762793 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:21:08.762766251 +0000 UTC m=+149.653919848 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.762838 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:21:08.762818433 +0000 UTC m=+149.653972160 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.782647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.782714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.782727 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.782744 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.782775 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.885372 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.885434 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.885454 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.885479 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.885497 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
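The "No retries permitted until 2025-10-02 11:21:08" records above show the volume manager's exponential backoff: durationBeforeRetry has reached 1m4s, i.e. 64s, which is seven doublings of a 500ms initial delay. A sketch of that schedule, assuming the 500ms seed and a cap of a little over two minutes (both constants are assumptions; only the 1m4s figure appears in this log):

    // backoff.go - illustrative sketch of the doubling retry delay implied by
    // "durationBeforeRetry 1m4s"; not the actual kubelet source.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond          // assumed initial delay
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: wait %v before retrying MountVolume\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
        // attempt 8 prints "wait 1m4s", matching the durationBeforeRetry above.
    }

The "not registered" errors themselves indicate the kubelet's object caches for those ConfigMaps and Secrets have not been populated yet, so each mount attempt fails immediately and the backoff keeps growing.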
Has your network provider started?"} Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.949038 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.949362 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.950567 4658 scope.go:117] "RemoveContainer" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" Oct 02 11:20:04 crc kubenswrapper[4658]: E1002 11:20:04.950973 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.987646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.987701 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.987717 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.987739 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:04 crc kubenswrapper[4658]: I1002 11:20:04.987756 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:04Z","lastTransitionTime":"2025-10-02T11:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.091228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.091287 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.091341 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.091375 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.091393 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.194409 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.194490 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.194512 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.194545 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.194569 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.298069 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.298148 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.298166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.298195 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.298218 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.401387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.401453 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.401470 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.401492 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.401511 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.503647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.503714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.503738 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.503765 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.503787 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.606995 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.607061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.607084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.607115 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.607138 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.710060 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.710121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.710143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.710171 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.710191 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.813235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.813284 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.813320 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.813339 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.813350 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.917373 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.917513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.917531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.917554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.917575 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:05Z","lastTransitionTime":"2025-10-02T11:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.949219 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.949227 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:05 crc kubenswrapper[4658]: I1002 11:20:05.949336 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:05 crc kubenswrapper[4658]: E1002 11:20:05.949746 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:05 crc kubenswrapper[4658]: E1002 11:20:05.949822 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:05 crc kubenswrapper[4658]: E1002 11:20:05.949884 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.020845 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.020907 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.020928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.020957 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.020983 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.123746 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.123837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.123856 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.123899 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.123929 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.226405 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.226463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.226476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.226497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.226511 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.328960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.329005 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.329014 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.329071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.329083 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.431945 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.432000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.432017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.432040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.432057 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.534562 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.534637 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.534657 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.534681 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.534702 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.638429 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.638500 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.638522 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.638547 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.638564 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.741924 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.741988 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.742005 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.742031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.742050 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.844830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.844886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.844901 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.844923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.844939 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948087 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948350 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:06 crc kubenswrapper[4658]: E1002 11:20:06.948353 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
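
Every NodeNotReady heartbeat in this stretch carries the same root cause: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration file, and on this cluster writing that file is the job of OVN-Kubernetes (note the ovnkube-controller crash loop above). The readiness check itself reduces to scanning that directory for a parseable config; the sketch below illustrates the idea only and is not CRI-O's actual implementation (the directory comes from the message above, the extensions from CNI convention).

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // networkReady reports whether confDir holds at least one CNI
    // configuration file, the condition behind the repeated
    // "no CNI configuration file in /etc/kubernetes/cni/net.d/" error.
    func networkReady(confDir string) (bool, error) {
        entries, err := os.ReadDir(confDir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // conventional CNI config names
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ready, err := networkReady("/etc/kubernetes/cni/net.d")
        fmt.Println("NetworkReady:", ready, "err:", err)
    }
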
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948380 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:06 crc kubenswrapper[4658]: I1002 11:20:06.948407 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:06Z","lastTransitionTime":"2025-10-02T11:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.052100 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.052165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.052182 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.052208 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.052221 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.154788 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.155031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.155048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.155072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.155083 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.257842 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.257903 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.257934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.257962 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.257986 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.361134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.361180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.361191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.361214 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.361225 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.465673 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.465726 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.465736 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.465756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.465767 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.569086 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.569151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.569162 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.569183 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.569196 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.671653 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.672111 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.672122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.672139 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.672153 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.774658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.774723 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.774731 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.774746 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.774755 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.878356 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.878407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.878420 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.878437 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.878449 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.948482 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.948604 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.948858 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:07 crc kubenswrapper[4658]: E1002 11:20:07.948942 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:07 crc kubenswrapper[4658]: E1002 11:20:07.949078 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:07 crc kubenswrapper[4658]: E1002 11:20:07.949159 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
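
The CrashLoopBackOff for ovnkube-controller recorded earlier ("back-off 20s restarting failed container=ovnkube-controller") follows the same doubling pattern as the volume retries, only with different constants: kubelet's container restart backoff conventionally starts at 10s and doubles per consecutive crash up to a 5-minute cap, so a 20s back-off corresponds to the second crash in a row. A minimal sketch under those assumed constants:

    package main

    import (
        "fmt"
        "time"
    )

    // Assumed kubelet-style crash-loop constants: 10s base, doubling,
    // capped at 5 minutes.
    const (
        crashBase = 10 * time.Second
        crashCap  = 5 * time.Minute
    )

    // restartDelay returns the back-off applied after n consecutive crashes.
    func restartDelay(n int) time.Duration {
        d := crashBase
        for i := 1; i < n; i++ {
            d *= 2
            if d > crashCap {
                return crashCap
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 6; n++ {
            fmt.Printf("crash %d -> back-off %v\n", n, restartDelay(n))
        }
        // crash 2 -> back-off 20s, as logged for ovnkube-controller.
    }
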
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.981625 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.981682 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.981694 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.981714 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:07 crc kubenswrapper[4658]: I1002 11:20:07.981732 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:07Z","lastTransitionTime":"2025-10-02T11:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.083956 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.084019 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.084035 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.084054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.084067 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.187564 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.187620 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.187632 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.187647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.187655 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.290494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.290565 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.290582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.290608 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.290625 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.393697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.393781 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.393810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.393843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.393868 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.496235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.496366 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.496390 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.496418 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.496440 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.599562 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.599616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.599628 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.599645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.599656 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.667802 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.667862 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.667873 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.667895 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.667910 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.685462 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.690408 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.690459 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.690471 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.690490 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.690504 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.712201 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.716461 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.716517 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.716529 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.716547 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.716557 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.769000 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.778507 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.778579 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.778589 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.778605 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.778616 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.795953 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.800151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.800183 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.800191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.800206 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.800218 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.814709 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:08Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.814821 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.816673 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.816701 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.816712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.816728 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.816738 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.918730 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.918788 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.918798 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.918811 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.918819 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:08Z","lastTransitionTime":"2025-10-02T11:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:08 crc kubenswrapper[4658]: I1002 11:20:08.948561 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:08 crc kubenswrapper[4658]: E1002 11:20:08.948762 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.021434 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.021503 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.021527 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.021554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.021572 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.125110 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.125242 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.125262 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.125287 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.125329 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.228037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.228112 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.228135 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.228166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.228187 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.331672 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.331759 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.331784 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.331815 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.331839 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.435236 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.435340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.435369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.435401 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.435424 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.538425 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.538529 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.538551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.538578 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.538596 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.641450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.641494 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.641504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.641520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.641532 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.744910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.744996 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.745021 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.745057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.745085 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.848069 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.848122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.848134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.848152 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.848166 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.948222 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.948414 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:09 crc kubenswrapper[4658]: E1002 11:20:09.948562 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.948641 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:09 crc kubenswrapper[4658]: E1002 11:20:09.948852 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:09 crc kubenswrapper[4658]: E1002 11:20:09.948956 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.951591 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.951638 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.951652 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.951671 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.951683 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:09Z","lastTransitionTime":"2025-10-02T11:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.971163 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.984605 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:09Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:09 crc kubenswrapper[4658]: I1002 11:20:09.999655 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:09Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.015399 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.034173 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.047975 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.053882 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.053921 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.053936 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.053952 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.053964 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.068738 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.085679 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.105384 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.123686 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.138772 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157323 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157432 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157449 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.157725 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.174973 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.189748 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.206279 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.224665 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.239281 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.255673 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.259642 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.259677 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.259688 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.259703 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.259713 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.278959 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.362883 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.362975 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.363023 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.363047 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.363063 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.466214 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.466341 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.466403 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.466430 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.466482 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.568830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.568890 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.568912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.568938 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.568966 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.672017 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.672074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.672092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.672116 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.672132 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.774398 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.774457 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.774475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.774498 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.774518 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.876997 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.877064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.877076 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.877094 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.877106 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.949138 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:10 crc kubenswrapper[4658]: E1002 11:20:10.949378 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.980647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.980712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.980733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.980757 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:10 crc kubenswrapper[4658]: I1002 11:20:10.980774 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:10Z","lastTransitionTime":"2025-10-02T11:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.083565 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.083607 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.083618 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.083634 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.083645 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.186691 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.186792 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.186823 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.186859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.186884 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.301639 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.301693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.301706 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.301725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.301738 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.405727 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.405814 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.405827 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.405874 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.405892 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.508615 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.508683 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.508697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.508719 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.508733 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.612345 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.612400 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.612415 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.612435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.612448 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.715504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.715569 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.715583 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.715606 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.715623 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.818999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.819069 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.819126 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.819151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.819169 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.922450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.922518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.922535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.922563 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.922581 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:11Z","lastTransitionTime":"2025-10-02T11:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.948498 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.948592 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:11 crc kubenswrapper[4658]: I1002 11:20:11.948498 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:11 crc kubenswrapper[4658]: E1002 11:20:11.948748 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:11 crc kubenswrapper[4658]: E1002 11:20:11.948891 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:11 crc kubenswrapper[4658]: E1002 11:20:11.949086 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.026056 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.026129 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.026151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.026179 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.026198 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.129480 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.129551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.129568 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.129597 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.129616 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.233098 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.233163 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.233181 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.233204 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.233222 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.340450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.340531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.340553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.340583 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.340604 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.444852 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.444917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.444929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.444947 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.444958 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.548708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.548763 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.548773 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.548795 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.548808 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.652146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.652215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.652233 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.652264 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.652283 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.755587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.755655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.755672 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.755694 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.755710 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.858687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.858745 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.858756 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.858773 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.858785 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.948853 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:12 crc kubenswrapper[4658]: E1002 11:20:12.949025 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.961888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.961925 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.961935 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.961950 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:12 crc kubenswrapper[4658]: I1002 11:20:12.961959 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:12Z","lastTransitionTime":"2025-10-02T11:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.065594 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.065651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.065670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.065697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.065713 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.168447 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.168552 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.168585 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.168622 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.168646 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.271770 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.271813 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.271822 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.271838 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.271849 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.374762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.375110 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.375336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.375556 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.375732 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.479970 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.480031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.480054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.480084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.480102 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.583351 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.583428 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.583448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.583475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.583492 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.687596 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.687693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.687705 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.687733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.687751 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.790805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.790848 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.790861 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.790879 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.790891 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.895328 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.895388 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.895399 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.895422 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.895434 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.948287 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.948405 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:13 crc kubenswrapper[4658]: E1002 11:20:13.948573 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:13 crc kubenswrapper[4658]: E1002 11:20:13.948751 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.948857 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:13 crc kubenswrapper[4658]: E1002 11:20:13.949069 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.998536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.998640 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.998670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.998709 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:13 crc kubenswrapper[4658]: I1002 11:20:13.998737 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:13Z","lastTransitionTime":"2025-10-02T11:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.101909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.101987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.102027 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.102048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.102066 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.204963 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.205044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.205066 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.205090 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.205104 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.308170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.308242 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.308261 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.308287 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.308336 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.411468 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.411553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.411577 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.411607 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.411626 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.514535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.514604 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.514628 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.514659 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.514680 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.618101 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.618170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.618187 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.618212 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.618229 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.721165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.721214 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.721229 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.721249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.721261 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.824998 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.825042 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.825057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.825076 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.825090 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.928798 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.928875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.928897 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.928926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.928944 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:14Z","lastTransitionTime":"2025-10-02T11:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:14 crc kubenswrapper[4658]: I1002 11:20:14.948707 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:14 crc kubenswrapper[4658]: E1002 11:20:14.948887 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.032980 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.033070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.033101 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.033121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.033131 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.135528 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.135589 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.135603 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.135630 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.135647 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.238632 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.238680 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.238690 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.238707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.238721 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.341616 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.341664 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.341675 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.341696 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.341708 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.444900 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.444942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.444955 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.444971 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.444981 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.548113 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.548192 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.548210 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.548228 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.548243 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.651163 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.651261 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.651274 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.651289 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.651340 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.754332 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.754379 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.754391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.754416 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.754427 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.857582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.857657 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.857668 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.857690 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.857703 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.948614 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.948740 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.948668 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:15 crc kubenswrapper[4658]: E1002 11:20:15.948905 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:15 crc kubenswrapper[4658]: E1002 11:20:15.949060 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:15 crc kubenswrapper[4658]: E1002 11:20:15.949228 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.959888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.959922 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.959930 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.959946 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:15 crc kubenswrapper[4658]: I1002 11:20:15.959956 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:15Z","lastTransitionTime":"2025-10-02T11:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.062793 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.062846 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.062859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.062880 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.062893 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.165943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.166001 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.166014 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.166033 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.166047 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.269834 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.269896 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.269908 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.269928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.269941 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.373666 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.373753 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.373775 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.373805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.373824 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.477037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.477102 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.477114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.477136 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.477152 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.580877 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.580956 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.580975 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.581002 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.581025 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.683849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.683914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.683927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.683949 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.683963 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.790776 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.790830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.790841 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.790859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.790872 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.894157 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.894222 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.894235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.894256 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.894271 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.949087 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:16 crc kubenswrapper[4658]: E1002 11:20:16.949335 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.950254 4658 scope.go:117] "RemoveContainer" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.997169 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.997243 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.997263 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.997291 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:16 crc kubenswrapper[4658]: I1002 11:20:16.997377 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:16Z","lastTransitionTime":"2025-10-02T11:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.100152 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.100205 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.100214 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.100229 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.100242 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.202951 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.202996 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.203007 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.203024 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.203035 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.307919 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.308658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.308693 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.308726 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.308748 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.411857 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.411917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.411934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.411983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.411999 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.514734 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.514773 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.514785 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.514803 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.514816 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.523061 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/2.log" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.526936 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.527511 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.548336 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.568499 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.584091 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.605963 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd9
61ec69f0a6eb1ef47d9e1023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:20:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.617061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.617114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.617129 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.617153 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.617170 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.618874 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.632774 4658 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.647611 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.660510 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.679337 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.700166 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.720249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.720307 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.720316 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.720331 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.720343 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.727153 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.738101 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.749529 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.762687 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"qu
ay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.775411 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.789963 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.801897 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.815648 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.822321 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.822372 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.822383 4658 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.822401 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.822417 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.830934 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.925842 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.925899 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.925912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.925935 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.925948 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:17Z","lastTransitionTime":"2025-10-02T11:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.948535 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.948547 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:17 crc kubenswrapper[4658]: I1002 11:20:17.948557 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:17 crc kubenswrapper[4658]: E1002 11:20:17.948761 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:17 crc kubenswrapper[4658]: E1002 11:20:17.948965 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:17 crc kubenswrapper[4658]: E1002 11:20:17.949266 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.029064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.029146 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.029170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.029202 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.029226 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.132819 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.132902 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.132921 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.132944 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.132963 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.236236 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.236336 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.236355 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.236385 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.236402 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.339134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.339209 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.339242 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.339271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.339324 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.442266 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.442366 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.442385 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.442410 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.442427 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.533892 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/3.log" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.535102 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/2.log" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.539259 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" exitCode=1 Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.539357 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.539430 4658 scope.go:117] "RemoveContainer" containerID="1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.540407 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:20:18 crc kubenswrapper[4658]: E1002 11:20:18.540768 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.547856 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.547936 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.547959 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.547994 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.548019 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.560112 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.586457 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.601138 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.623389 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.637396 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.652554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.652589 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.652597 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.652611 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.652623 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.659149 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.674600 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\
\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.693453 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.711796 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.731004 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.755236 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd9
61ec69f0a6eb1ef47d9e1023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1de08493bf6c30b2ab547d1c9e3151d752210e7940f519796453cb3d48234eba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:48Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:19:48Z is after 2025-08-24T17:21:41Z]\\\\nI1002 11:19:48.868714 6553 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:20:17Z\\\",\\\"message\\\":\\\"74b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905158 6891 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905203 6891 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manage\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:20:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.755928 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.755972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.755984 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.756000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.756011 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.768582 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.785905 4658 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.798900 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.811602 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.825111 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.838183 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.858167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.858210 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.858222 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.858241 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.858255 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.859760 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.869978 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.947796 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.947880 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.947909 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.947942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.947964 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.948107 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:18 crc kubenswrapper[4658]: E1002 11:20:18.948217 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:18 crc kubenswrapper[4658]: E1002 11:20:18.964754 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:18Z is after 
2025-08-24T17:21:41Z" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.968593 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.968645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.968661 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.968683 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:18 crc kubenswrapper[4658]: I1002 11:20:18.968700 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:18Z","lastTransitionTime":"2025-10-02T11:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.024247 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 
2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.030155 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.030185 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.030193 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.030206 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.030215 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.046214 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 
2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.049520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.049554 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.049564 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.049581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.049592 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.063262 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 
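[editor's note — analysis added for readability, not journal output. Every status patch in this excerpt fails for the same reason: when the API server admits the kubelet's PATCH, it must call the network-node-identity admission webhooks (pod.network-node-identity.openshift.io for the openshift-kube-scheduler-crc pod status earlier, node.network-node-identity.openshift.io for the node status), and the webhook server at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z against a current time of 2025-10-02T11:20:18Z, so certificate verification fails before any patch is applied. The node patch is attempted five times per sync loop, matching the kubelet's nodeStatusUpdateRetry constant of 5: the first attempt appears in full above, attempts two through four are elided, and the fifth follows below. The accompanying NotReady condition ("no CNI configuration file in /etc/kubernetes/cni/net.d/") is consistent with the network provider itself not yet having started under the same expired certificates; on a CRC bundle first booted after its embedded certificates have lapsed, the usual recovery path is the cluster's own certificate rotation shortly after startup. As a minimal sketch, the Go program below (illustrative, not from the log; it assumes a Go toolchain on the node and reuses the endpoint address from the errors above) prints the validity window of the certificate the webhook actually serves:

    // certcheck.go — illustrative only: dial the webhook endpoint reported in
    // the kubelet errors and print the validity window of its serving cert.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Verification is skipped deliberately: the goal is to read the
        // expired certificate, not to trust it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743",
            &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        // Per the log above, notAfter should read 2025-08-24T17:21:41Z.
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
                cert.Subject.String(),
                cert.NotBefore.UTC().Format(time.RFC3339),
                cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }

If the printed notAfter still reads 2025-08-24T17:21:41Z, the webhook is serving the stale certificate and the retries below will keep failing until it is rotated. — end note]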
2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.070762 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.070799 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.070809 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.070825 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.070836 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.084313 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 
2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.084489 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.086678 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.086750 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.086790 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.086871 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.086888 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.190476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.190545 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.190565 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.190594 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.190612 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.293151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.293229 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.293239 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.293254 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.293264 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.395708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.395775 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.395791 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.395821 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.395838 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.498812 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.498889 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.498913 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.498935 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.498952 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.545209 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/3.log" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.549284 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.549496 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.564923 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.580083 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.592232 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.602178 4658 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.602225 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.602234 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.602249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.602260 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.608559 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.622437 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.635473 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.655331 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd9
61ec69f0a6eb1ef47d9e1023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:20:17Z\\\",\\\"message\\\":\\\"74b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905158 6891 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905203 6891 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manage\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:20:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.666163 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217
cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.677787 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.690128 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.700152 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.704531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.704563 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.704572 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.704587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.704600 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.714746 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.723927 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.743517 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.754867 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.765199 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.779615 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.790637 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.800205 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.806964 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.806998 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.807010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.807026 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.807036 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.909858 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.909910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.909923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.909943 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.909956 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:19Z","lastTransitionTime":"2025-10-02T11:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.948620 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.948715 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.948822 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.948896 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.949254 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:19 crc kubenswrapper[4658]: E1002 11:20:19.949462 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.964695 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.977973 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:19 crc kubenswrapper[4658]: I1002 11:20:19.992288 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.006384 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.011875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.011916 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.011929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.011947 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.011962 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.025106 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.062200 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:20:17Z\\\",\\\"message\\\":\\\"74b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905158 6891 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905203 6891 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manage\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:20:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.083986 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a
2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.098162 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.111172 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.115269 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.115382 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.115407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.115435 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.115455 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.122986 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.138578 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.151982 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.172895 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.188862 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.203709 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.218446 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.218515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.218529 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.218546 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.218581 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.222750 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.237745 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.253176 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.266889 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:20Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.321206 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.321321 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.321340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.321360 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.321374 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.423721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.423767 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.423779 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.423801 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.423813 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.440431 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:20 crc kubenswrapper[4658]: E1002 11:20:20.440583 4658 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 11:20:20 crc kubenswrapper[4658]: E1002 11:20:20.440667 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs podName:2ea83baf-570c-46db-ad98-aa9ec89d1c82 nodeName:}" failed. No retries permitted until 2025-10-02 11:21:24.440644343 +0000 UTC m=+165.331797920 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs") pod "network-metrics-daemon-6fxls" (UID: "2ea83baf-570c-46db-ad98-aa9ec89d1c82") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.526731 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.526792 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.526815 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.526843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.526856 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.629997 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.630054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.630072 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.630096 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.630112 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.733326 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.733369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.733378 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.733397 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.733407 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.836707 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.836811 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.836828 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.837057 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.837075 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.939824 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.939887 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.939905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.939931 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.939950 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:20Z","lastTransitionTime":"2025-10-02T11:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:20 crc kubenswrapper[4658]: I1002 11:20:20.948165 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:20 crc kubenswrapper[4658]: E1002 11:20:20.948392 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.043121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.043156 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.043166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.043180 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.043190 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.145807 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.145848 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.145858 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.145873 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.145883 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.248244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.248392 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.248419 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.248454 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.248482 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.350465 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.350508 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.350520 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.350536 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.350547 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.452669 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.452726 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.452744 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.452775 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.452796 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.554502 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.554551 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.554565 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.554586 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.554601 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.657731 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.657776 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.657787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.657805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.657818 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.760003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.760069 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.760092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.760121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.760142 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.862989 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.863064 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.863102 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.863134 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.863158 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.949242 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.949337 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.949417 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:21 crc kubenswrapper[4658]: E1002 11:20:21.949532 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:21 crc kubenswrapper[4658]: E1002 11:20:21.949663 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:21 crc kubenswrapper[4658]: E1002 11:20:21.949806 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.966238 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.966284 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.966314 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.966337 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:21 crc kubenswrapper[4658]: I1002 11:20:21.966349 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:21Z","lastTransitionTime":"2025-10-02T11:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.069065 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.069108 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.069123 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.069143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.069162 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.171710 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.171764 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.171781 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.171805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.171822 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.274780 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.274820 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.274833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.274850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.274863 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.377448 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.377509 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.377524 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.377545 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.377560 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.480553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.480639 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.480655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.480680 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.480702 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.583849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.583929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.583953 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.583988 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.584015 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.690160 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.690197 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.690206 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.690221 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.690231 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.792926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.792972 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.792983 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.793002 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.793016 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.895704 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.895785 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.895824 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.895849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.895864 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.948832 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:22 crc kubenswrapper[4658]: E1002 11:20:22.949093 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.998176 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.998279 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.998316 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.998341 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:22 crc kubenswrapper[4658]: I1002 11:20:22.998354 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:22Z","lastTransitionTime":"2025-10-02T11:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.101208 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.101265 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.101284 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.101352 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.101371 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.204763 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.204817 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.204833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.204855 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.204869 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.308001 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.308074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.308094 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.308122 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.308143 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.410453 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.410496 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.410508 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.410527 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.410540 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.513562 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.514462 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.514712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.514929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.515135 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.617698 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.617737 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.617764 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.617787 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.617802 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.720427 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.720474 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.720486 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.720504 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.720515 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.823174 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.823235 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.823247 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.823266 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.823276 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.928369 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.928443 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.928921 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.928947 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.928959 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:23Z","lastTransitionTime":"2025-10-02T11:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.948726 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.948833 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:23 crc kubenswrapper[4658]: E1002 11:20:23.948903 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:23 crc kubenswrapper[4658]: I1002 11:20:23.948753 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:23 crc kubenswrapper[4658]: E1002 11:20:23.948997 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:23 crc kubenswrapper[4658]: E1002 11:20:23.949061 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.031566 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.031601 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.031613 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.031631 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.031643 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.133867 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.133927 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.133941 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.133960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.133972 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.237168 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.237215 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.237223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.237241 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.237250 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.339843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.339888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.339896 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.339910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.339921 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.443455 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.443510 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.443521 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.443539 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.443552 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.546474 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.546537 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.546547 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.546567 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.546577 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.649961 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.650004 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.650012 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.650029 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.650040 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.752320 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.752375 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.752389 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.752407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.752421 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.854595 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.854626 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.854634 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.854651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.854670 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.948349 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:24 crc kubenswrapper[4658]: E1002 11:20:24.948521 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.956990 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.957052 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.957074 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.957101 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:24 crc kubenswrapper[4658]: I1002 11:20:24.957126 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:24Z","lastTransitionTime":"2025-10-02T11:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.059121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.059162 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.059173 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.059188 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.059197 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.162089 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.162136 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.162148 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.162165 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.162178 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.264703 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.265044 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.265182 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.265365 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.265491 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.369050 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.369381 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.369463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.369549 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.369612 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.472862 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.472898 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.472908 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.472926 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.472936 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.576227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.576272 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.576281 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.576314 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.576325 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.679465 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.679944 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.680038 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.680150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.680235 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.782859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.783279 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.783423 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.783513 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.783605 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.886766 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.887050 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.887660 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.887749 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.887827 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.949188 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.949247 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:25 crc kubenswrapper[4658]: E1002 11:20:25.949744 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.949317 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:25 crc kubenswrapper[4658]: E1002 11:20:25.950005 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:20:25 crc kubenswrapper[4658]: E1002 11:20:25.949830 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.989987 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.990061 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.990086 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.990115 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:25 crc kubenswrapper[4658]: I1002 11:20:25.990135 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:25Z","lastTransitionTime":"2025-10-02T11:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.092696 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.092741 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.092753 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.092771 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.092783 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.195123 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.195164 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.195176 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.195191 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.195202 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.297830 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.297883 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.297893 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.297914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.297924 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.401887 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.401957 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.401978 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.402003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.402026 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.505054 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.505099 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.505109 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.505125 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.505135 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.608046 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.608127 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.608151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.608183 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.608204 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.710311 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.710362 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.710372 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.710388 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.710397 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.812968 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.813034 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.813051 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.813076 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.813093 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.915855 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.915904 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.915914 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.915930 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.915957 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:26Z","lastTransitionTime":"2025-10-02T11:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:26 crc kubenswrapper[4658]: I1002 11:20:26.948185 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:26 crc kubenswrapper[4658]: E1002 11:20:26.948326 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.018167 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.018258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.018283 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.018340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.018357 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.121003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.121071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.121084 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.121103 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.121114 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.223775 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.223871 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.223897 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.223932 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.223957 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.327080 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.327132 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.327143 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.327161 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.327175 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.430048 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.430114 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.430137 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.430166 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.430184 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.532763 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.532848 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.532875 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.532908 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.532934 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.635562 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.635643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.635662 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.635687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.635704 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.738311 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.739194 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.739353 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.739470 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.739657 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.842759 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.842823 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.842841 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.842872 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.842899 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.945630 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.945978 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.946129 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.946275 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.946447 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:27Z","lastTransitionTime":"2025-10-02T11:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.949176 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.949570 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:27 crc kubenswrapper[4658]: I1002 11:20:27.949550 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:27 crc kubenswrapper[4658]: E1002 11:20:27.949805 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:27 crc kubenswrapper[4658]: E1002 11:20:27.949889 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:20:27 crc kubenswrapper[4658]: E1002 11:20:27.949998 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.049379 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.049438 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.049452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.049476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.049489 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.152847 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.152896 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.152905 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.152923 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.152932 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.255521 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.255561 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.255574 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.255591 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.255602 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.358407 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.358476 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.358492 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.358519 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.358537 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.461790 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.461837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.461847 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.461863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.461874 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.564727 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.564808 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.564819 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.564839 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.564852 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.667200 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.667251 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.667263 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.667282 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.667314 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.769458 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.769507 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.769518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.769539 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.769551 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.872810 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.872884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.872903 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.872929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.872955 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.948398 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:28 crc kubenswrapper[4658]: E1002 11:20:28.948655 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.981990 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.982059 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.982077 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.982101 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:28 crc kubenswrapper[4658]: I1002 11:20:28.982118 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:28Z","lastTransitionTime":"2025-10-02T11:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.084746 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.084824 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.084843 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.084870 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.084889 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.188428 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.188523 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.188537 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.188581 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.188599 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.291327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.291377 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.291387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.291408 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.291423 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.292776 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.292836 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.292852 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.292882 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.292894 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.312202 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.317643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.317700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.317716 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.317738 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.317752 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.334435 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.339434 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.339497 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.339518 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.339549 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.339571 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.359203 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch above, elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.365871 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.365948 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.366009 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.366037 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.366340 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.389636 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch above, elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.394851 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.394917 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.394936 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.394960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.394978 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.411521 4658 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:20:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch above, elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"989d2d6c-e7aa-470a-8c4e-33361ee1def6\\\",\\\"systemUUID\\\":\\\"6a661c31-2aab-46f6-9356-aadb249c199d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.411686 4658 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.413317 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.413342 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.413352 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.413368 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.413378 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.516040 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.516109 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.516126 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.516150 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.516168 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.619812 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.619879 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.619898 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.620389 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.620423 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.722175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.722217 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.722227 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.722244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.722255 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.824519 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.824591 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.824605 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.824646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.824661 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.927170 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.927244 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.927261 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.927292 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.927337 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:29Z","lastTransitionTime":"2025-10-02T11:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.949023 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.949176 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.949394 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.949655 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.949736 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:29 crc kubenswrapper[4658]: E1002 11:20:29.949876 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.970669 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee62a7fae87d6097b10f65d011bdf0b17fa10db42f2ae8f9a6d9dd8f7887f6ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2534b47b51f7a716cfd473ebde1d094b212111423c7426741704b132ce00c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:29 crc kubenswrapper[4658]: I1002 11:20:29.986887 4658 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53173b86-be4f-4b39-8f70-f7282ab529fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9caee2e90b7547c65e24af3e663746e37a9a524493423a5d9ba4528572009ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hqhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pnjp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:29Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.007466 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnfts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894543ca-6e44-42e8-b41b-4578646d527f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7e18733fee98e5a81cc2e2d08cf2606585db88ed4ad316110ec524db875f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e441591760c03455a4bd79eb66a03f8d5ec5ddfb7f1a2e6755bb4390bec7c7b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c1ff82088b2087909205c3daa150b19aa6e5fb4e4513807b673e51c244721e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db38ddda0be37bdcbdc5b604f9da6d0fda8e862ad4fb3fcb12cefb017f6aa8cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668dfe27432c3dda7a490d575b4dc4cbe9a9384afb9b408065a6e68768c71c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27fffcc81c09e77d975d429803a7785debd6a73a8042cb36f61d063055c26c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://079e83dc22378fb85865229056c4f3ac0427eebb030c0e8768edc5c1fb458554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb7ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnfts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.022928 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6fxls" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ea83baf-570c-46db-ad98-aa9ec89d1c82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xq5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6fxls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.031475 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.031531 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.031590 4658 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.031615 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.031634 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.037675 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bd8e8d3-85ef-4048-ac6b-49921bde380c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23330935e83f85001f5fdca938b3fda718894207e685d2ac46b8c70606165702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4003d10d29a8d8ee336f3a08ae7fdb315f923c34208217cf3d4b77a3e85bbcb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",
\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.054728 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16354faf-76c6-49d8-9053-23e9a97e9f0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a338dad7821bcf41b6354132a5b0f304fe120b29de7a976b75a704d49ae7956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077841c0e48651d65be1655e6876849fd3ac9636dd4dfad09bb8842ab0c24838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cff1efa26caf7771576b8bd2f83a22b0614bbb5f4a57047a5bf960a067fcf9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b
82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0735ea156bfc6925a3a82d1cb10f1e5bac2a0f1cf7735cdcfdadb0150d1e6180\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.074523 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://998993b17e03e46bc4581bc855eac8891a191e3d1d3b22965fb56fc10880449b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.090102 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.105748 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.134883 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.134929 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.134942 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.134960 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.134973 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.137131 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dea12458-2637-446e-b388-4f139b3fd000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:20:17Z\\\",\\\"message\\\":\\\"74b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905158 6891 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 11:20:17.905203 6891 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} 
name:Service_openshift-operator-lifecycle-manage\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:20:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8hnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2t8w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.168564 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7797c654-f5f7-44fb-b764-bd437afa2162\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6485e2cb7ee8fbf5bfa52c3a02061b8976537022f4edae11b72de319a3d1e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc21a6fc6f3c4dd1d96343a2aca98ac366c5fa1861bba36ad0fc20a725bcd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cb42ef3b68d9738fc8d634d18157993eecfa1b0449a1d186497d0250888d20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6390715efb2d185aaa4a3c25189efdab559335
f2423b06aa2a4cfe45674adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc557ca536b61e8668049e98dd5eeacd4b581e0eea5799b9db6c41a7b3d4bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4fe3484f2b34c8eb0d25f08f0ccaa23418be745985af20177295c0a0b3e11fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f63b11ae8f6c9f55ddd68ee4a59afa3b0d6b394e4d0e183597052939bc9ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf77ff22055fe267f6bcbd9c70f5010ec9035770b3326aaea280aa250039a12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.185695 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719cbf22-1687-47cb-826d-490850b20e2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47559f3d1d2978d3efb6b3da597a76e58fd003e65ed8e6009174c7ba0214f1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d39cc55a552da63e8b9e464a650b2b82ff9984c2783363fc4d202e51f23cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26cf3f6e299bbd9fd9090c44f82572378de418f34d8c7d8a5150067f0db5124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://494e0c1d491d9c8af7d0e848443f5eb4281a93c183ad73535c2b46548c707879\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.202910 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31ba58c9e3c75d36f2e0fc7201528964f534460fcd71b1f0ff7e1d5e134a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.218652 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d9dfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9423545-b965-4de6-86b1-5af8bdf55a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76c38178fe7df9e26a3076417119b88480c491780e8e0bebb11b25fddd0a48f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6vx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d9dfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.239561 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.239665 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.239685 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.239750 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.239775 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.240518 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-thtgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a005aa-c7db-4d46-968b-8a9a0c00bbd5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:19:49Z\\\",\\\"message\\\":\\\"2025-10-02T11:19:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4\\\\n2025-10-02T11:19:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_582d8ad7-3fa7-4a87-a6cf-c253716a09c4 to /host/opt/cni/bin/\\\\n2025-10-02T11:19:04Z [verbose] multus-daemon started\\\\n2025-10-02T11:19:04Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:19:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:19:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ptmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-thtgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.258033 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nwq8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f23292d-4f7c-4850-bd3d-895a85ec5392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f29d02619874d87bfac5da84672127723bf928785afbe1188cf8c2afcb8261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbtx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nwq8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.275751 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb626c94-f2a6-40b1-8d2b-5331a0e41eea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf42b698bc2b3823a200dbf2c3e765a1b2d820224c532662a8c0eb820df64ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8445865b6a5fd3905888f3e8bc3d77123a4ec489a0d3d3c4498ca74843ab8cb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e38261f469658ddc62da917b4f9b42fea4473a41ed1549049661cffc8269cd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30688517c7885da46bb990699c63c6dd47b7a8f88437e0e6df00956078519c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13665ff5798b493e28c526120ad59029dac4839b5b0a44fe8f66e22401243a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:19:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 11:19:00.778960 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 11:19:00.779142 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:19:00.780216 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4281577435/tls.crt::/tmp/serving-cert-4281577435/tls.key\\\\\\\"\\\\nI1002 11:19:01.231689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:19:01.234650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:19:01.234674 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:19:01.234701 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:19:01.234709 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:19:01.243950 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:19:01.243999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:19:01.244023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:19:01.244030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:19:01.244037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:19:01.244044 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:19:01.244434 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:19:01.245796 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d96f22103cf3a22191c9e17ef8bfba4df1e34a66cab49681789e44fc7297243\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:18:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de2749519af709a0a4b5f35401def8769e604e3035260d036a3ead3c3772c6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:18:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.295852 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.309865 4658 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f01b099-f45d-4f2e-8e0d-e2e8b36d9384\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:19:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a6622046ab9224222978d4b89677c67c020bc7349e1bfb7f281b3e2abdf1b17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79872979225be04d6ad4c7ba46217e46f81677dbb8827ab2b288520afdbc7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrs78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:19:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2bqqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:20:30Z is after 2025-08-24T17:21:41Z" Oct 02 
11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.343535 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.343606 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.343634 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.343668 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.343692 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.447079 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.447156 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.447177 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.447738 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.447795 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.552047 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.552104 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.552121 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.552145 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.552164 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.654887 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.654957 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.654976 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.655000 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.655019 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.758414 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.758491 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.758515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.758553 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.758581 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.861658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.861733 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.861759 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.861788 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.861810 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.948681 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:30 crc kubenswrapper[4658]: E1002 11:20:30.949263 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.949743 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:20:30 crc kubenswrapper[4658]: E1002 11:20:30.950053 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.965349 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.965430 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.965456 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.965487 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:30 crc kubenswrapper[4658]: I1002 11:20:30.965510 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:30Z","lastTransitionTime":"2025-10-02T11:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.068725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.068758 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.068768 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.068782 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.068793 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.171387 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.171452 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.171473 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.171499 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.171517 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.274645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.274725 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.274753 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.274784 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.274808 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.377135 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.377247 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.377257 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.377271 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.377279 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.480089 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.480135 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.480151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.480168 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.480180 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.582615 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.582697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.582720 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.582746 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.582765 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.686144 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.686205 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.686223 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.686249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.686265 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.789598 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.789642 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.789653 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.789670 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.789683 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.892005 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.892071 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.892091 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.892117 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.892136 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.948283 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.948360 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.948374 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:31 crc kubenswrapper[4658]: E1002 11:20:31.949279 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:31 crc kubenswrapper[4658]: E1002 11:20:31.949763 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:31 crc kubenswrapper[4658]: E1002 11:20:31.949916 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.995837 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.995881 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.995897 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.995918 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:31 crc kubenswrapper[4658]: I1002 11:20:31.995936 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:31Z","lastTransitionTime":"2025-10-02T11:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.099178 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.099245 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.099262 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.099291 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.099348 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.202883 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.202944 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.202961 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.202986 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.203005 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.306912 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.306985 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.307020 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.307053 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.307075 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.409754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.409850 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.409870 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.409892 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.409906 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.512711 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.513139 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.513372 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.513582 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.513772 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.616934 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.617034 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.617060 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.617095 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.617124 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.720357 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.720431 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.720450 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.720482 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.720503 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.823884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.823977 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.824002 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.824031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.824052 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.927226 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.927285 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.927321 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.927344 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.927360 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:32Z","lastTransitionTime":"2025-10-02T11:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:32 crc kubenswrapper[4658]: I1002 11:20:32.948108 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:32 crc kubenswrapper[4658]: E1002 11:20:32.948888 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.030572 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.030626 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.030643 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.030695 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.030713 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.134331 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.134376 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.134391 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.134408 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.134421 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.237729 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.237795 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.237817 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.237839 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.237854 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.341003 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.341125 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.341151 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.341184 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.341211 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.444547 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.444622 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.444645 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.444674 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.444697 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.548243 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.548367 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.548385 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.548412 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.548429 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.651968 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.652019 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.652031 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.652050 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.652063 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.755738 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.755809 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.755829 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.755861 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.755883 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.858678 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.858754 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.858774 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.858795 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.858815 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.948779 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.948779 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:33 crc kubenswrapper[4658]: E1002 11:20:33.948952 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:33 crc kubenswrapper[4658]: E1002 11:20:33.949047 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.948796 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:33 crc kubenswrapper[4658]: E1002 11:20:33.949227 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
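
The block above repeats roughly every 100 ms: five kubelet_node_status.go events followed by the setters.go "Node became not ready" record, whose Ready condition stays status False with reason KubeletNotReady until a CNI config appears. The same condition can be watched from outside the journal; a minimal sketch, assuming a working kubeconfig for this cluster (on CRC, the bundled kubeadmin kubeconfig):

    # Show only the Ready condition of node "crc"
    kubectl get node crc -o jsonpath='{.status.conditions[?(@.type=="Ready")]}'
    # Or stream node status changes as they happen
    kubectl get node crc -w
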
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.961463 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.961515 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.961530 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.961552 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:33 crc kubenswrapper[4658]: I1002 11:20:33.961569 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:33Z","lastTransitionTime":"2025-10-02T11:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.064587 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.064655 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.064674 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.064700 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.064720 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.168264 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.168355 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.168372 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.168400 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.168423 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.271258 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.271327 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.271340 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.271359 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.271373 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.373656 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.373697 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.373709 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.373726 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.373738 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.478024 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.478173 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.478196 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.478249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.478279 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.585441 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.585538 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.585550 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.585571 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.585584 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.688199 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.688262 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.688283 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.688354 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.688379 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.791560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.791640 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.791651 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.791665 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.791673 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.893948 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.893999 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.894010 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.894028 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.894037 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.948709 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:34 crc kubenswrapper[4658]: E1002 11:20:34.948952 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
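
Every message in this stretch traces back to one root cause: the kubelet's CNI conf dir is /etc/kubernetes/cni/net.d/ (OpenShift's layout; the upstream default is /etc/cni/net.d), and that directory is still empty because the cluster network provider (multus, typically writing 00-multus.conf here once it is healthy) has not produced a config yet. On OpenShift the fix is to get multus and the default network plugin running, not to hand-write a file; purely as an illustration of the kind of file the kubelet is waiting for, a minimal bridge-plugin config with illustrative values:

    # ls /etc/kubernetes/cni/net.d/   -> currently empty, hence NetworkPluginNotReady
    {
      "cniVersion": "0.4.0",
      "name": "example-bridge",
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" }
    }
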
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.997560 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.997627 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.997647 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.997679 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:34 crc kubenswrapper[4658]: I1002 11:20:34.997701 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:34Z","lastTransitionTime":"2025-10-02T11:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.101429 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.101466 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.101478 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.101493 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.101503 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.206805 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.206859 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.206869 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.206888 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.206899 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.310533 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.310617 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.310646 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.310678 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.310699 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.413634 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.413692 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.413709 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.413734 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.413753 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.516852 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.516906 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.516918 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.516941 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.516958 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.620013 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.620092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.620115 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.620152 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.620175 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.723708 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.723783 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.723803 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.723832 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.723852 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.827092 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.827157 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.827181 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.827211 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.827231 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.930541 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.930593 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.930611 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.930635 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.930653 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:35Z","lastTransitionTime":"2025-10-02T11:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.948408 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.948490 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:35 crc kubenswrapper[4658]: I1002 11:20:35.948676 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:35 crc kubenswrapper[4658]: E1002 11:20:35.948863 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:35 crc kubenswrapper[4658]: E1002 11:20:35.949101 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:35 crc kubenswrapper[4658]: E1002 11:20:35.949240 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.033172 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.033236 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.033249 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.033268 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.033285 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.137108 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.137184 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.137203 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.137274 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.137326 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.240478 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.240623 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.240658 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.240690 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.240713 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.343791 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.343845 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.343862 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.343886 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.343904 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.447067 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.447131 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.447148 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.447175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.447192 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.550070 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.550130 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.550147 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.550175 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.550197 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.608779 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/1.log"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.609512 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/0.log"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.609592 4658 generic.go:334] "Generic (PLEG): container finished" podID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5" containerID="96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5" exitCode=1
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.609643 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerDied","Data":"96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5"}
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.609701 4658 scope.go:117] "RemoveContainer" containerID="fe71b17699723a0c9da6d7b576013b285706ad897bd432be749304d594f5d385"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.611207 4658 scope.go:117] "RemoveContainer" containerID="96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5"
Oct 02 11:20:36 crc kubenswrapper[4658]: E1002 11:20:36.612114 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-thtgx_openshift-multus(69a005aa-c7db-4d46-968b-8a9a0c00bbd5)\"" pod="openshift-multus/multus-thtgx" podUID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.642075 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=46.642054248 podStartE2EDuration="46.642054248s" podCreationTimestamp="2025-10-02 11:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.641652884 +0000 UTC m=+117.532806551" watchObservedRunningTime="2025-10-02 11:20:36.642054248 +0000 UTC m=+117.533207855"
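
Here the kube-multus container has exited with code 1, so the kubelet prunes the older dead container (the first RemoveContainer, fe71b1...), declines to restart the new one, and applies CrashLoopBackOff: the back-off starts at 10 s and by default doubles on each subsequent failure up to a five-minute cap. To see why the container keeps exiting, one would pull its previous logs; a sketch, with the pod name and container ID taken from the records above:

    oc -n openshift-multus logs multus-thtgx -c kube-multus --previous
    # Or directly on the node via CRI-O:
    crictl logs 96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5
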
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.670730 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=95.670620238 podStartE2EDuration="1m35.670620238s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.664508342 +0000 UTC m=+117.555661919" watchObservedRunningTime="2025-10-02 11:20:36.670620238 +0000 UTC m=+117.561773815"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.776605 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=93.776576976 podStartE2EDuration="1m33.776576976s" podCreationTimestamp="2025-10-02 11:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.775620275 +0000 UTC m=+117.666773862" watchObservedRunningTime="2025-10-02 11:20:36.776576976 +0000 UTC m=+117.667730543"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.815961 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=62.815932395 podStartE2EDuration="1m2.815932395s" podCreationTimestamp="2025-10-02 11:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.793643746 +0000 UTC m=+117.684797313" watchObservedRunningTime="2025-10-02 11:20:36.815932395 +0000 UTC m=+117.707085972"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.854533 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d9dfl" podStartSLOduration=95.85451428 podStartE2EDuration="1m35.85451428s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.840553479 +0000 UTC m=+117.731707046" watchObservedRunningTime="2025-10-02 11:20:36.85451428 +0000 UTC m=+117.745667847"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.867116 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nwq8l" podStartSLOduration=95.867104975 podStartE2EDuration="1m35.867104975s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.866348371 +0000 UTC m=+117.757501938" watchObservedRunningTime="2025-10-02 11:20:36.867104975 +0000 UTC m=+117.758258542"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.887917 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=94.887887306 podStartE2EDuration="1m34.887887306s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.887334538 +0000 UTC m=+117.778488125" watchObservedRunningTime="2025-10-02 11:20:36.887887306 +0000 UTC m=+117.779040883"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.939257 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2bqqx" podStartSLOduration=94.939235221 podStartE2EDuration="1m34.939235221s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.92151267 +0000 UTC m=+117.812666247" watchObservedRunningTime="2025-10-02 11:20:36.939235221 +0000 UTC m=+117.830388808"
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.948740 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:36 crc kubenswrapper[4658]: E1002 11:20:36.949101 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
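Each of these sync failures names the same root cause: nothing matching a CNI config exists under /etc/kubernetes/cni/net.d/ yet. A minimal sketch of that kind of directory probe, assuming libcni's default extension set (.conf, .conflist, .json); it illustrates the condition being reported, not the runtime's actual code path:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir holds at least one CNI network config.
// The extension set matches libcni's defaults (.conf, .conflist, .json).
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println(ok, err) // false until the network operator writes a config
}
```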
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.956492 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podStartSLOduration=95.956435196 podStartE2EDuration="1m35.956435196s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.95528668 +0000 UTC m=+117.846440247" watchObservedRunningTime="2025-10-02 11:20:36.956435196 +0000 UTC m=+117.847588773" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.963790 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.963849 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.963863 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.963884 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.963897 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:36Z","lastTransitionTime":"2025-10-02T11:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 11:20:36 crc kubenswrapper[4658]: I1002 11:20:36.976488 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fnfts" podStartSLOduration=95.976469442 podStartE2EDuration="1m35.976469442s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:36.975583483 +0000 UTC m=+117.866737070" watchObservedRunningTime="2025-10-02 11:20:36.976469442 +0000 UTC m=+117.867623009"
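The pod_startup_latency_tracker records cluster here, and their fields expose the arithmetic: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and the zeroed 0001-01-01 pull timestamps mean no image pull contributed. A quick check of that relationship in Go, using the kube-rbac-proxy-crio-crc values quoted at 11:20:36.642075:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the kube-rbac-proxy-crio-crc record; Go's
	// time.Parse accepts the fractional seconds even though the layout
	// string omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created, err := time.Parse(layout, "2025-10-02 11:19:50 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-10-02 11:20:36.642054248 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(running.Sub(created)) // 46.642054248s == podStartSLOduration
}
```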
Oct 02 11:20:37 crc kubenswrapper[4658]: I1002 11:20:37.616484 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/1.log"
Oct 02 11:20:37 crc kubenswrapper[4658]: I1002 11:20:37.948607 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls"
Oct 02 11:20:37 crc kubenswrapper[4658]: I1002 11:20:37.948693 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:20:37 crc kubenswrapper[4658]: E1002 11:20:37.948801 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82"
Oct 02 11:20:37 crc kubenswrapper[4658]: I1002 11:20:37.948933 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:20:37 crc kubenswrapper[4658]: E1002 11:20:37.949160 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:20:37 crc kubenswrapper[4658]: E1002 11:20:37.949274 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:20:38 crc kubenswrapper[4658]: I1002 11:20:38.948399 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:20:38 crc kubenswrapper[4658]: E1002 11:20:38.948578 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.027672 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.027712 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.027721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.027737 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.027746 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:39Z","lastTransitionTime":"2025-10-02T11:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.130687 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.130721 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.130729 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.130743 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.130752 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:39Z","lastTransitionTime":"2025-10-02T11:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.673833 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.673882 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.673893 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.673910 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.673922 4658 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:20:39Z","lastTransitionTime":"2025-10-02T11:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.734423 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"]
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.735018 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.738959 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.739022 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.739071 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.739642 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.775348 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.775443 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/766c234f-492a-4539-98cf-ee8965435860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.775506 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766c234f-492a-4539-98cf-ee8965435860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.775591 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.775679 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/766c234f-492a-4539-98cf-ee8965435860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877342 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/766c234f-492a-4539-98cf-ee8965435860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f"
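The ordering above is the volume manager's usual two-step: every volume of the new cluster-version-operator pod first passes VerifyControllerAttachedVolume, and only then do the MountVolume.SetUp operations begin. A toy sketch of that shape, with verify/mount stand-ins that are purely illustrative and not the kubelet's operationExecutor API:

```go
package main

import "fmt"

// The five volumes the reconciler names for cluster-version-operator-5c965bbfc6-tvc4f.
var volumes = []string{
	"etc-cvo-updatepayloads",
	"service-ca",
	"serving-cert",
	"etc-ssl-certs",
	"kube-api-access",
}

// verifyAttached stands in for VerifyControllerAttachedVolume; host-path and
// configmap-style volumes need no controller attach, so it trivially passes.
func verifyAttached(volume string) bool { return true }

// mountVolume stands in for MountVolume.SetUp.
func mountVolume(volume string) error {
	fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", volume)
	return nil
}

func main() {
	// Step 1: confirm every desired volume is attached.
	for _, v := range volumes {
		if !verifyAttached(v) {
			fmt.Println("waiting for attach:", v)
			return
		}
	}
	// Step 2: mount each verified volume into the pod's volumes directory.
	for _, v := range volumes {
		if err := mountVolume(v); err != nil {
			fmt.Println("mount failed:", v, err)
		}
	}
}
```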
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/766c234f-492a-4539-98cf-ee8965435860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877406 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766c234f-492a-4539-98cf-ee8965435860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877437 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877477 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/766c234f-492a-4539-98cf-ee8965435860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877524 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.877624 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.878683 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/766c234f-492a-4539-98cf-ee8965435860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.881130 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.881903 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.892106 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/766c234f-492a-4539-98cf-ee8965435860-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.899023 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766c234f-492a-4539-98cf-ee8965435860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.904055 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.918236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/766c234f-492a-4539-98cf-ee8965435860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tvc4f\" (UID: \"766c234f-492a-4539-98cf-ee8965435860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.948571 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.948594 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:39 crc kubenswrapper[4658]: I1002 11:20:39.948693 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:39 crc kubenswrapper[4658]: E1002 11:20:39.949662 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:39 crc kubenswrapper[4658]: E1002 11:20:39.949845 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:39 crc kubenswrapper[4658]: E1002 11:20:39.949959 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:39 crc kubenswrapper[4658]: E1002 11:20:39.953634 4658 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 11:20:40 crc kubenswrapper[4658]: I1002 11:20:40.061316 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 11:20:40 crc kubenswrapper[4658]: I1002 11:20:40.069867 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" Oct 02 11:20:40 crc kubenswrapper[4658]: E1002 11:20:40.383439 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:20:40 crc kubenswrapper[4658]: I1002 11:20:40.626576 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" event={"ID":"766c234f-492a-4539-98cf-ee8965435860","Type":"ContainerStarted","Data":"50ed9e6898d05d40a4473a4dadaa14049b8625d7eeec8411fb9b1b873b0617ba"} Oct 02 11:20:40 crc kubenswrapper[4658]: I1002 11:20:40.626661 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" event={"ID":"766c234f-492a-4539-98cf-ee8965435860","Type":"ContainerStarted","Data":"f0e9a7b44ac5d2541ef4c848b480fdf21a836dcecb80002c1bf693bfa884df69"} Oct 02 11:20:40 crc kubenswrapper[4658]: I1002 11:20:40.949050 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:40 crc kubenswrapper[4658]: E1002 11:20:40.949196 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:41 crc kubenswrapper[4658]: I1002 11:20:41.949033 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:41 crc kubenswrapper[4658]: I1002 11:20:41.949103 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:41 crc kubenswrapper[4658]: I1002 11:20:41.949062 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:41 crc kubenswrapper[4658]: E1002 11:20:41.949349 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:41 crc kubenswrapper[4658]: E1002 11:20:41.949429 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:41 crc kubenswrapper[4658]: E1002 11:20:41.949517 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:42 crc kubenswrapper[4658]: I1002 11:20:42.948524 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:42 crc kubenswrapper[4658]: E1002 11:20:42.948745 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:42 crc kubenswrapper[4658]: I1002 11:20:42.949642 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:20:42 crc kubenswrapper[4658]: E1002 11:20:42.949987 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:43 crc kubenswrapper[4658]: I1002 11:20:43.949115 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:43 crc kubenswrapper[4658]: I1002 11:20:43.949232 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:43 crc kubenswrapper[4658]: E1002 11:20:43.949318 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:43 crc kubenswrapper[4658]: I1002 11:20:43.949354 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:43 crc kubenswrapper[4658]: E1002 11:20:43.949461 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:43 crc kubenswrapper[4658]: E1002 11:20:43.949823 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:44 crc kubenswrapper[4658]: I1002 11:20:44.949029 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:44 crc kubenswrapper[4658]: E1002 11:20:44.949222 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:45 crc kubenswrapper[4658]: E1002 11:20:45.385101 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:20:45 crc kubenswrapper[4658]: I1002 11:20:45.949139 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:45 crc kubenswrapper[4658]: I1002 11:20:45.949270 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:45 crc kubenswrapper[4658]: I1002 11:20:45.949344 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:45 crc kubenswrapper[4658]: E1002 11:20:45.949708 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:45 crc kubenswrapper[4658]: E1002 11:20:45.949860 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:45 crc kubenswrapper[4658]: E1002 11:20:45.950007 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:46 crc kubenswrapper[4658]: I1002 11:20:46.949121 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:46 crc kubenswrapper[4658]: E1002 11:20:46.949332 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:47 crc kubenswrapper[4658]: I1002 11:20:47.949248 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:47 crc kubenswrapper[4658]: I1002 11:20:47.949289 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:47 crc kubenswrapper[4658]: E1002 11:20:47.949537 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:47 crc kubenswrapper[4658]: I1002 11:20:47.949329 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:47 crc kubenswrapper[4658]: E1002 11:20:47.949662 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:47 crc kubenswrapper[4658]: E1002 11:20:47.949800 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:48 crc kubenswrapper[4658]: I1002 11:20:48.948111 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:48 crc kubenswrapper[4658]: E1002 11:20:48.948289 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:49 crc kubenswrapper[4658]: I1002 11:20:49.948409 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:49 crc kubenswrapper[4658]: I1002 11:20:49.948480 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:49 crc kubenswrapper[4658]: E1002 11:20:49.951423 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:49 crc kubenswrapper[4658]: I1002 11:20:49.951477 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:49 crc kubenswrapper[4658]: E1002 11:20:49.951622 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:49 crc kubenswrapper[4658]: E1002 11:20:49.951797 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:50 crc kubenswrapper[4658]: E1002 11:20:50.385897 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:20:50 crc kubenswrapper[4658]: I1002 11:20:50.948419 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:50 crc kubenswrapper[4658]: E1002 11:20:50.948707 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:51 crc kubenswrapper[4658]: I1002 11:20:51.948794 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:51 crc kubenswrapper[4658]: E1002 11:20:51.949773 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:51 crc kubenswrapper[4658]: I1002 11:20:51.949066 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:51 crc kubenswrapper[4658]: E1002 11:20:51.949970 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:51 crc kubenswrapper[4658]: I1002 11:20:51.948985 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:51 crc kubenswrapper[4658]: I1002 11:20:51.949200 4658 scope.go:117] "RemoveContainer" containerID="96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5" Oct 02 11:20:51 crc kubenswrapper[4658]: E1002 11:20:51.950122 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:51 crc kubenswrapper[4658]: I1002 11:20:51.979363 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tvc4f" podStartSLOduration=110.979332806 podStartE2EDuration="1m50.979332806s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:40.645611054 +0000 UTC m=+121.536764671" watchObservedRunningTime="2025-10-02 11:20:51.979332806 +0000 UTC m=+132.870486413" Oct 02 11:20:52 crc kubenswrapper[4658]: I1002 11:20:52.675910 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/1.log" Oct 02 11:20:52 crc kubenswrapper[4658]: I1002 11:20:52.675991 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerStarted","Data":"f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006"} Oct 02 11:20:52 crc kubenswrapper[4658]: I1002 11:20:52.698470 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-thtgx" podStartSLOduration=111.698435191 podStartE2EDuration="1m51.698435191s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:20:52.697312924 +0000 UTC m=+133.588466551" watchObservedRunningTime="2025-10-02 11:20:52.698435191 +0000 UTC m=+133.589588798" Oct 02 11:20:52 crc kubenswrapper[4658]: I1002 11:20:52.949124 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:52 crc kubenswrapper[4658]: E1002 11:20:52.949348 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:53 crc kubenswrapper[4658]: I1002 11:20:53.948556 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:53 crc kubenswrapper[4658]: I1002 11:20:53.948606 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:53 crc kubenswrapper[4658]: I1002 11:20:53.948606 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:53 crc kubenswrapper[4658]: E1002 11:20:53.948826 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:53 crc kubenswrapper[4658]: E1002 11:20:53.948928 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:53 crc kubenswrapper[4658]: E1002 11:20:53.949011 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:54 crc kubenswrapper[4658]: I1002 11:20:54.948738 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:54 crc kubenswrapper[4658]: E1002 11:20:54.948948 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:54 crc kubenswrapper[4658]: I1002 11:20:54.949964 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:20:54 crc kubenswrapper[4658]: E1002 11:20:54.950214 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2t8w8_openshift-ovn-kubernetes(dea12458-2637-446e-b388-4f139b3fd000)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" Oct 02 11:20:55 crc kubenswrapper[4658]: E1002 11:20:55.387264 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:20:55 crc kubenswrapper[4658]: I1002 11:20:55.948429 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:55 crc kubenswrapper[4658]: I1002 11:20:55.948536 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:55 crc kubenswrapper[4658]: E1002 11:20:55.948636 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:55 crc kubenswrapper[4658]: E1002 11:20:55.948733 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:55 crc kubenswrapper[4658]: I1002 11:20:55.949120 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:55 crc kubenswrapper[4658]: E1002 11:20:55.949350 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:56 crc kubenswrapper[4658]: I1002 11:20:56.948960 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:56 crc kubenswrapper[4658]: E1002 11:20:56.949124 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:57 crc kubenswrapper[4658]: I1002 11:20:57.948163 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:57 crc kubenswrapper[4658]: I1002 11:20:57.948238 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:57 crc kubenswrapper[4658]: I1002 11:20:57.948170 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:57 crc kubenswrapper[4658]: E1002 11:20:57.948464 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:57 crc kubenswrapper[4658]: E1002 11:20:57.948809 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:57 crc kubenswrapper[4658]: E1002 11:20:57.949111 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:20:58 crc kubenswrapper[4658]: I1002 11:20:58.948507 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:20:58 crc kubenswrapper[4658]: E1002 11:20:58.948715 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:20:59 crc kubenswrapper[4658]: I1002 11:20:59.948610 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:20:59 crc kubenswrapper[4658]: I1002 11:20:59.948709 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:20:59 crc kubenswrapper[4658]: E1002 11:20:59.951128 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:20:59 crc kubenswrapper[4658]: I1002 11:20:59.951180 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:20:59 crc kubenswrapper[4658]: E1002 11:20:59.951660 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:20:59 crc kubenswrapper[4658]: E1002 11:20:59.951798 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:00 crc kubenswrapper[4658]: E1002 11:21:00.388132 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:21:00 crc kubenswrapper[4658]: I1002 11:21:00.948590 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:00 crc kubenswrapper[4658]: E1002 11:21:00.948817 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:21:01 crc kubenswrapper[4658]: I1002 11:21:01.948552 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:01 crc kubenswrapper[4658]: I1002 11:21:01.948651 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:01 crc kubenswrapper[4658]: E1002 11:21:01.948742 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:21:01 crc kubenswrapper[4658]: I1002 11:21:01.948775 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:01 crc kubenswrapper[4658]: E1002 11:21:01.948843 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:21:01 crc kubenswrapper[4658]: E1002 11:21:01.948960 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:02 crc kubenswrapper[4658]: I1002 11:21:02.948115 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:02 crc kubenswrapper[4658]: E1002 11:21:02.948284 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:21:03 crc kubenswrapper[4658]: I1002 11:21:03.948741 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:03 crc kubenswrapper[4658]: I1002 11:21:03.948809 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:03 crc kubenswrapper[4658]: I1002 11:21:03.948866 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:03 crc kubenswrapper[4658]: E1002 11:21:03.948919 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:21:03 crc kubenswrapper[4658]: E1002 11:21:03.949028 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:03 crc kubenswrapper[4658]: E1002 11:21:03.949111 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:21:04 crc kubenswrapper[4658]: I1002 11:21:04.948404 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:04 crc kubenswrapper[4658]: E1002 11:21:04.948598 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:21:05 crc kubenswrapper[4658]: E1002 11:21:05.389432 4658 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:21:05 crc kubenswrapper[4658]: I1002 11:21:05.949103 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:05 crc kubenswrapper[4658]: I1002 11:21:05.949190 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:05 crc kubenswrapper[4658]: I1002 11:21:05.949141 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:05 crc kubenswrapper[4658]: E1002 11:21:05.949416 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:05 crc kubenswrapper[4658]: E1002 11:21:05.949693 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:21:05 crc kubenswrapper[4658]: E1002 11:21:05.949822 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:21:06 crc kubenswrapper[4658]: I1002 11:21:06.949369 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:06 crc kubenswrapper[4658]: E1002 11:21:06.950314 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:21:06 crc kubenswrapper[4658]: I1002 11:21:06.950541 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.731614 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/3.log" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.735014 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerStarted","Data":"6a0357622298d3d3fe3388b77219e229e40bef5c13d2a18b87c4c843459c761d"} Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.735464 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.763655 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podStartSLOduration=126.763638507 podStartE2EDuration="2m6.763638507s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:07.762969784 +0000 UTC m=+148.654123361" watchObservedRunningTime="2025-10-02 11:21:07.763638507 +0000 UTC m=+148.654792084" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.813177 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6fxls"] Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.813347 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:07 crc kubenswrapper[4658]: E1002 11:21:07.813468 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.948273 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:07 crc kubenswrapper[4658]: I1002 11:21:07.948273 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:07 crc kubenswrapper[4658]: E1002 11:21:07.948435 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:21:07 crc kubenswrapper[4658]: E1002 11:21:07.948499 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.847086 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.847396 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.847465 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.847524 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.847580 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847666 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847726 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847725 4658 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847755 4658 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847821 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847855 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:23:10.847819954 +0000 UTC m=+271.738973561 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847866 4658 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847888 4658 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847896 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:23:10.847878126 +0000 UTC m=+271.739031733 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847918 4658 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.847972 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:23:10.847944599 +0000 UTC m=+271.739098286 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.848045 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:23:10.848013961 +0000 UTC m=+271.739167608 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.848820 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:23:10.848803307 +0000 UTC m=+271.739956984 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:08 crc kubenswrapper[4658]: I1002 11:21:08.948461 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:08 crc kubenswrapper[4658]: E1002 11:21:08.948629 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:21:09 crc kubenswrapper[4658]: I1002 11:21:09.948684 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:09 crc kubenswrapper[4658]: I1002 11:21:09.948771 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:09 crc kubenswrapper[4658]: E1002 11:21:09.950781 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fxls" podUID="2ea83baf-570c-46db-ad98-aa9ec89d1c82" Oct 02 11:21:09 crc kubenswrapper[4658]: I1002 11:21:09.950846 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:09 crc kubenswrapper[4658]: E1002 11:21:09.951019 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:21:09 crc kubenswrapper[4658]: E1002 11:21:09.951167 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.397008 4658 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.425876 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xbkft"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.426533 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442195 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442233 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442537 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442697 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442823 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442902 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442830 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.442992 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.443141 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.443242 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.452137 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gjt96"]
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.452943 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.453195 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j267v"]
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.459351 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"]
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.459924 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.459967 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.460181 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.460479 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.460893 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.461094 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.461135 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.462963 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463814 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfa1953c-4c82-4463-b772-6b871bcea9b8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463867 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463869 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-audit\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463951 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lf7x\" (UniqueName: \"kubernetes.io/projected/bfa1953c-4c82-4463-b772-6b871bcea9b8-kube-api-access-8lf7x\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463977 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-client\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.463996 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464009 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd75n\" (UniqueName: \"kubernetes.io/projected/dd736d13-0140-458a-bbdf-bed6d2e55ce1-kube-api-access-jd75n\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464035 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-node-pullsecrets\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464060 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-serving-cert\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464084 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-encryption-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464107 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd736d13-0140-458a-bbdf-bed6d2e55ce1-serving-cert\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464142 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464163 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-audit-dir\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464188 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-image-import-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464213 4658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-serving-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464240 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dd736d13-0140-458a-bbdf-bed6d2e55ce1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464267 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cktm\" (UniqueName: \"kubernetes.io/projected/93963d75-dbb2-414c-9218-aee78bb8f819-kube-api-access-9cktm\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464290 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-images\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464338 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-config\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464371 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.464970 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465062 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465193 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465368 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-w7rrv"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465455 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465634 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465831 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465945 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.465895 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.466494 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.467235 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.469340 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.469862 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.470226 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.470604 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.470710 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.471491 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.473418 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v4m9t"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.473697 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.474151 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lhc27"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.474726 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.486861 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.489341 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-md7fr"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.490241 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.492651 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.492969 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.493172 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.493415 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.506793 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510586 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510631 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510757 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510826 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510976 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510995 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511077 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511092 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511179 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511192 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511256 4658 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511273 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511363 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511386 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511454 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511479 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511532 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511535 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511575 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511598 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511661 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511671 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511803 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.510603 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511928 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.511975 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512042 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512095 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512124 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512198 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512237 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512323 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512366 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512456 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512504 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512595 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512633 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512786 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512896 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512957 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513014 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513039 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513054 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.512600 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513124 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513135 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513188 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 
11:21:10.513210 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513246 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513327 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513338 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513195 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513435 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513452 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.513512 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-47rr5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.514154 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.515080 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lrcf"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.515392 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.516370 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.516388 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517657 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517749 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517773 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517859 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517876 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.517991 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.518080 4658 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.518456 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.518815 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.519434 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.519561 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520035 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520201 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520118 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520658 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520077 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.520119 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.522149 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.524645 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.525706 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.525865 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.526654 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.527162 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.528244 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.534243 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.536410 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.538875 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.539260 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.539261 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.549933 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.550173 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.564320 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566025 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566142 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lf7x\" (UniqueName: \"kubernetes.io/projected/bfa1953c-4c82-4463-b772-6b871bcea9b8-kube-api-access-8lf7x\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566184 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-client\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566225 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd75n\" (UniqueName: \"kubernetes.io/projected/dd736d13-0140-458a-bbdf-bed6d2e55ce1-kube-api-access-jd75n\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566262 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-node-pullsecrets\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566335 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-serving-cert\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566412 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd736d13-0140-458a-bbdf-bed6d2e55ce1-serving-cert\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.566892 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-encryption-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.567447 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585481 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-audit-dir\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585510 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-image-import-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585531 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-serving-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585553 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dd736d13-0140-458a-bbdf-bed6d2e55ce1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585578 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-images\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585593 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cktm\" (UniqueName: \"kubernetes.io/projected/93963d75-dbb2-414c-9218-aee78bb8f819-kube-api-access-9cktm\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585612 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-config\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585650 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.585711 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfa1953c-4c82-4463-b772-6b871bcea9b8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: 
I1002 11:21:10.585730 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-audit\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.586447 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-audit\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.586891 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.587646 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.587802 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-audit-dir\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.587923 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.588570 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.588777 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-client\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.589025 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92"]
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.589097 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.589336 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.589586 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh"]
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.590322 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-encryption-config\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
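Each volume in these records moves through three stages that can be traced by its UniqueName: the reconciler first logs "operationExecutor.VerifyControllerAttachedVolume started" (reconciler_common.go:245), then "operationExecutor.MountVolume started" (reconciler_common.go:218), and finally the operation generator logs "MountVolume.SetUp succeeded" (operation_generator.go:637) — the audit, config, audit-dir, etcd-client, and encryption-config volumes of apiserver-76f77b778f-xbkft have all reached the final stage at this point. A sketch that reports the furthest stage reached per volume, useful for spotting mounts that never complete (same assumptions as above: one record per line, hypothetical input file name):

```python
import re
import sys

# The three per-volume stages visible in these records, in order.
STAGES = [
    'operationExecutor.VerifyControllerAttachedVolume started',
    'operationExecutor.MountVolume started',
    'MountVolume.SetUp succeeded',
]
# UniqueName appears with escaped quotes inside the klog message,
# e.g. (UniqueName: \"kubernetes.io/configmap/<pod-uid>-audit\")
UNIQUE = re.compile(r'UniqueName: \\?"(kubernetes\.io/[^\\"]+)')

def last_stage(path):
    """Map each volume UniqueName to the index of the furthest stage seen."""
    seen = {}
    with open(path) as f:
        for line in f:
            for rank, stage in enumerate(STAGES):
                if stage in line:
                    m = UNIQUE.search(line)
                    if m:
                        seen[m.group(1)] = max(seen.get(m.group(1), -1), rank)
    return seen

if __name__ == '__main__':
    for name, rank in sorted(last_stage(sys.argv[1]).items()):
        print(f'{STAGES[rank]:55s} {name}')
```

Ranking stages by list index keeps the comparison trivial; a volume that is still stuck at VerifyControllerAttachedVolume would stand out immediately in the output.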
pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.590353 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.590612 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-image-import-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591073 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-etcd-serving-ca\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591351 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dd736d13-0140-458a-bbdf-bed6d2e55ce1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591453 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93963d75-dbb2-414c-9218-aee78bb8f819-node-pullsecrets\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591549 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591782 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.591933 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-images\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.592305 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93963d75-dbb2-414c-9218-aee78bb8f819-serving-cert\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.593118 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1953c-4c82-4463-b772-6b871bcea9b8-config\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.593635 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgmnk"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.593812 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.594256 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93963d75-dbb2-414c-9218-aee78bb8f819-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.594555 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.594843 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.595102 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.595874 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.596366 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd736d13-0140-458a-bbdf-bed6d2e55ce1-serving-cert\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.597688 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.598578 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.598919 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfa1953c-4c82-4463-b772-6b871bcea9b8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.599256 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.599977 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.600325 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.602222 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.602250 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m5pf4"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.603015 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.603355 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.603753 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.604193 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.604602 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.605085 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.605885 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.606009 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-g4bk2"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.606552 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.606989 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l6vlk"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.607802 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.607923 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.609286 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xbkft"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.612138 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.612185 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gjt96"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.612196 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.612461 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.614119 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.616819 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w7rrv"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.616848 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-md7fr"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.616857 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.619971 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.619998 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.620008 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-5lrcf"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.623078 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-47rr5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.623099 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j267v"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.623107 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v4m9t"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.624453 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.625214 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.626279 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.629832 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.630087 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.630133 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.631055 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lhc27"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.632243 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgmnk"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.636623 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.638552 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.640474 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.641764 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hmn5s"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.648654 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8xwb5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.649079 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.649636 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664778 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664828 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664844 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664857 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664872 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m5pf4"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664886 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664898 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664910 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664922 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664934 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8xwb5"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664944 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.664957 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.665053 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.668462 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l6vlk"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.669844 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.669980 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.670760 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kbq7v"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.677689 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kbq7v"] Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.677905 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686348 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwrww\" (UniqueName: \"kubernetes.io/projected/ebab7917-b306-46a8-8dc7-f99b4b162c71-kube-api-access-jwrww\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686388 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98wx\" (UniqueName: \"kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686414 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a583e679-8e90-4f82-b286-3eda40831c72-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686439 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b161a36-8654-4948-8412-bb68940fe512-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686464 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwp5\" (UniqueName: \"kubernetes.io/projected/0b1493cb-71c1-4283-9c81-b73014189a60-kube-api-access-vvwp5\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686482 4658 
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686498 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686513 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-config\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686537 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblmc\" (UniqueName: \"kubernetes.io/projected/2b161a36-8654-4948-8412-bb68940fe512-kube-api-access-fblmc\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686558 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686582 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686599 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686624 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a583e679-8e90-4f82-b286-3eda40831c72-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686642 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-pz4t6\" (UniqueName: \"kubernetes.io/projected/2670838d-90c6-490a-a620-676073872108-kube-api-access-pz4t6\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686657 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebab7917-b306-46a8-8dc7-f99b4b162c71-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686689 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686793 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2670838d-90c6-490a-a620-676073872108-machine-approver-tls\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686891 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebab7917-b306-46a8-8dc7-f99b4b162c71-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686929 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.686961 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687014 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-serving-cert\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687047 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-service-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687065 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f36aa73-2c64-431b-8991-37312e054756-trusted-ca\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687092 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687114 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2ng8\" (UniqueName: \"kubernetes.io/projected/34322326-016a-4e58-b14c-680c8cc94dbb-kube-api-access-v2ng8\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687142 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rjf2\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-kube-api-access-9rjf2\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687163 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f36aa73-2c64-431b-8991-37312e054756-metrics-tls\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687230 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzhx\" (UniqueName: \"kubernetes.io/projected/c670b59a-b4ec-4332-9a76-72fee4666277-kube-api-access-qlzhx\") pod \"downloads-7954f5f757-w7rrv\" (UID: \"c670b59a-b4ec-4332-9a76-72fee4666277\") " pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687269 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1493cb-71c1-4283-9c81-b73014189a60-serving-cert\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687312 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-trusted-ca\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687337 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-config\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687358 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjs8\" (UniqueName: \"kubernetes.io/projected/17feaa75-00bd-4b47-a857-5fa5b27427fb-kube-api-access-dqjs8\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687388 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34322326-016a-4e58-b14c-680c8cc94dbb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687415 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34322326-016a-4e58-b14c-680c8cc94dbb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687444 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687559 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-client\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687776 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5lg6\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-kube-api-access-x5lg6\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.687802 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-auth-proxy-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.707659 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.711090 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.728859 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.748535 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.769632 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788363 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-serving-cert\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788405 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-service-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788427 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f36aa73-2c64-431b-8991-37312e054756-trusted-ca\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788701 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788734 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2ng8\" (UniqueName: \"kubernetes.io/projected/34322326-016a-4e58-b14c-680c8cc94dbb-kube-api-access-v2ng8\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788759 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f36aa73-2c64-431b-8991-37312e054756-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788782 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rjf2\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-kube-api-access-9rjf2\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788819 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzhx\" (UniqueName: \"kubernetes.io/projected/c670b59a-b4ec-4332-9a76-72fee4666277-kube-api-access-qlzhx\") pod \"downloads-7954f5f757-w7rrv\" (UID: \"c670b59a-b4ec-4332-9a76-72fee4666277\") " pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788911 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1493cb-71c1-4283-9c81-b73014189a60-serving-cert\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788933 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-trusted-ca\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788979 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-config\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789031 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjs8\" (UniqueName: \"kubernetes.io/projected/17feaa75-00bd-4b47-a857-5fa5b27427fb-kube-api-access-dqjs8\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789092 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34322326-016a-4e58-b14c-680c8cc94dbb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789131 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34322326-016a-4e58-b14c-680c8cc94dbb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" 
Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789159 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789214 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-client\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789237 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5lg6\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-kube-api-access-x5lg6\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789259 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-auth-proxy-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789285 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a583e679-8e90-4f82-b286-3eda40831c72-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789326 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b161a36-8654-4948-8412-bb68940fe512-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789350 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwrww\" (UniqueName: \"kubernetes.io/projected/ebab7917-b306-46a8-8dc7-f99b4b162c71-kube-api-access-jwrww\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789374 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m98wx\" (UniqueName: \"kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789398 
4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-config\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789421 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwp5\" (UniqueName: \"kubernetes.io/projected/0b1493cb-71c1-4283-9c81-b73014189a60-kube-api-access-vvwp5\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789441 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-console-config\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789464 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789497 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblmc\" (UniqueName: \"kubernetes.io/projected/2b161a36-8654-4948-8412-bb68940fe512-kube-api-access-fblmc\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789532 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789557 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789578 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789614 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a583e679-8e90-4f82-b286-3eda40831c72-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789640 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4t6\" (UniqueName: \"kubernetes.io/projected/2670838d-90c6-490a-a620-676073872108-kube-api-access-pz4t6\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789664 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebab7917-b306-46a8-8dc7-f99b4b162c71-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789836 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789867 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2670838d-90c6-490a-a620-676073872108-machine-approver-tls\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789894 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebab7917-b306-46a8-8dc7-f99b4b162c71-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789916 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789938 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.790280 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.790799 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.791331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f36aa73-2c64-431b-8991-37312e054756-trusted-ca\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.791498 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-console-config\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.791772 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-serving-cert\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.789698 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-service-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.791888 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-config\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.792217 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.792338 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f36aa73-2c64-431b-8991-37312e054756-metrics-tls\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.792854 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-ca\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.793021 4658 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17feaa75-00bd-4b47-a857-5fa5b27427fb-config\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.793242 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a583e679-8e90-4f82-b286-3eda40831c72-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.788377 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.793407 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2670838d-90c6-490a-a620-676073872108-auth-proxy-config\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.793277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b1493cb-71c1-4283-9c81-b73014189a60-trusted-ca\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.793765 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebab7917-b306-46a8-8dc7-f99b4b162c71-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.794174 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.794787 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebab7917-b306-46a8-8dc7-f99b4b162c71-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.795328 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17feaa75-00bd-4b47-a857-5fa5b27427fb-etcd-client\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.795356 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.795814 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2670838d-90c6-490a-a620-676073872108-machine-approver-tls\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.797075 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a583e679-8e90-4f82-b286-3eda40831c72-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.797592 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1493cb-71c1-4283-9c81-b73014189a60-serving-cert\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.798985 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.808279 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.828462 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.848168 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.870881 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.888743 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.909201 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.929723 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.949013 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.949508 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.969260 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 11:21:10 crc kubenswrapper[4658]: I1002 11:21:10.990258 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.009874 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.028908 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.048590 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.071180 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.088906 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.108902 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.128835 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.148836 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.190109 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.208915 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.214735 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34322326-016a-4e58-b14c-680c8cc94dbb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.228698 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.238885 4658 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34322326-016a-4e58-b14c-680c8cc94dbb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.250044 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.269748 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.289558 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.297572 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b161a36-8654-4948-8412-bb68940fe512-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.309081 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.350419 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lf7x\" (UniqueName: \"kubernetes.io/projected/bfa1953c-4c82-4463-b772-6b871bcea9b8-kube-api-access-8lf7x\") pod \"machine-api-operator-5694c8668f-gjt96\" (UID: \"bfa1953c-4c82-4463-b772-6b871bcea9b8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.389145 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.394992 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.395925 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cktm\" (UniqueName: \"kubernetes.io/projected/93963d75-dbb2-414c-9218-aee78bb8f819-kube-api-access-9cktm\") pod \"apiserver-76f77b778f-xbkft\" (UID: \"93963d75-dbb2-414c-9218-aee78bb8f819\") " pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.409960 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.429335 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.449703 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.470681 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.489159 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.527236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd75n\" (UniqueName: \"kubernetes.io/projected/dd736d13-0140-458a-bbdf-bed6d2e55ce1-kube-api-access-jd75n\") pod \"openshift-config-operator-7777fb866f-j267v\" (UID: \"dd736d13-0140-458a-bbdf-bed6d2e55ce1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.531279 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.549186 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.569192 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.589153 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.609411 4658 request.go:700] Waited for 1.014325681s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.611164 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.628961 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.629883 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-gjt96"] Oct 02 11:21:11 crc kubenswrapper[4658]: W1002 11:21:11.638078 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa1953c_4c82_4463_b772_6b871bcea9b8.slice/crio-917851f2dd343d8aebd13c859fd97bef4337ab576fb29c9020d798b92c1ea097 WatchSource:0}: Error finding container 917851f2dd343d8aebd13c859fd97bef4337ab576fb29c9020d798b92c1ea097: Status 404 returned error can't find the container with id 917851f2dd343d8aebd13c859fd97bef4337ab576fb29c9020d798b92c1ea097 Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.649151 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.660928 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.670316 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.690078 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.708762 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.720600 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.731465 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.749897 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.750174 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" event={"ID":"bfa1953c-4c82-4463-b772-6b871bcea9b8","Type":"ContainerStarted","Data":"917851f2dd343d8aebd13c859fd97bef4337ab576fb29c9020d798b92c1ea097"} Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.769600 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.793578 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.808657 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.825930 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xbkft"] Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.828906 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.849799 
4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.869814 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.889504 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.909161 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.915702 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j267v"] Oct 02 11:21:11 crc kubenswrapper[4658]: W1002 11:21:11.920854 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd736d13_0140_458a_bbdf_bed6d2e55ce1.slice/crio-4df0feb2f3d146cdca163026ed0661293d900ab816db2dcdd4a8d7ee1ff23e37 WatchSource:0}: Error finding container 4df0feb2f3d146cdca163026ed0661293d900ab816db2dcdd4a8d7ee1ff23e37: Status 404 returned error can't find the container with id 4df0feb2f3d146cdca163026ed0661293d900ab816db2dcdd4a8d7ee1ff23e37 Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.928575 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.948152 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.948640 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.948817 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.949235 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.968878 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 11:21:11 crc kubenswrapper[4658]: I1002 11:21:11.988882 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.009039 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.029267 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.049465 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.069532 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.089581 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.108707 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.128395 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.149100 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.170854 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.189604 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.208985 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.228779 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.249166 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.269683 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.289617 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.309557 4658 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.328994 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.349834 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.370379 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.390115 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.409236 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.429879 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.449557 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.470376 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.489508 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.509384 4658 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.529151 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.570032 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rjf2\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-kube-api-access-9rjf2\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.600617 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2ng8\" (UniqueName: \"kubernetes.io/projected/34322326-016a-4e58-b14c-680c8cc94dbb-kube-api-access-v2ng8\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgmpv\" (UID: \"34322326-016a-4e58-b14c-680c8cc94dbb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.616831 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98wx\" (UniqueName: \"kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx\") pod \"console-f9d7485db-md7fr\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.627964 4658 request.go:700] Waited for 1.837004629s due to client-side throttling, not priority and 
fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.635968 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzhx\" (UniqueName: \"kubernetes.io/projected/c670b59a-b4ec-4332-9a76-72fee4666277-kube-api-access-qlzhx\") pod \"downloads-7954f5f757-w7rrv\" (UID: \"c670b59a-b4ec-4332-9a76-72fee4666277\") " pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.651658 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.668148 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwp5\" (UniqueName: \"kubernetes.io/projected/0b1493cb-71c1-4283-9c81-b73014189a60-kube-api-access-vvwp5\") pod \"console-operator-58897d9998-v4m9t\" (UID: \"0b1493cb-71c1-4283-9c81-b73014189a60\") " pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.689923 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblmc\" (UniqueName: \"kubernetes.io/projected/2b161a36-8654-4948-8412-bb68940fe512-kube-api-access-fblmc\") pod \"control-plane-machine-set-operator-78cbb6b69f-4wktl\" (UID: \"2b161a36-8654-4948-8412-bb68940fe512\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.705873 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f36aa73-2c64-431b-8991-37312e054756-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x64tz\" (UID: \"4f36aa73-2c64-431b-8991-37312e054756\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.720848 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjs8\" (UniqueName: \"kubernetes.io/projected/17feaa75-00bd-4b47-a857-5fa5b27427fb-kube-api-access-dqjs8\") pod \"etcd-operator-b45778765-47rr5\" (UID: \"17feaa75-00bd-4b47-a857-5fa5b27427fb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.740705 4658 util.go:30] "No sandbox for pod can be found. 
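[Editor's note] The throttled POST .../serviceaccounts/cluster-image-registry-operator/token above is a TokenRequest call, the API that fills the "bound-sa-token" projected volumes being mounted in this stretch. A minimal sketch of the same call via client-go; the audience and lifetime are assumed illustrative values:

// token_request_sketch.go: requesting a bound service-account token,
// as the kubelet does when populating a bound-sa-token projected volume.
package main

import (
	"context"
	"fmt"

	authenticationv1 "k8s.io/api/authentication/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumes in-cluster credentials
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	exp := int64(3600) // bound tokens are short-lived and rotated by the kubelet
	tr, err := client.CoreV1().ServiceAccounts("openshift-image-registry").CreateToken(
		context.TODO(), "cluster-image-registry-operator",
		&authenticationv1.TokenRequest{
			Spec: authenticationv1.TokenRequestSpec{
				Audiences:         []string{"https://kubernetes.default.svc"},
				ExpirationSeconds: &exp,
			},
		}, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("token expires:", tr.Status.ExpirationTimestamp)
}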
Need to start a new one" pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.752193 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5lg6\" (UniqueName: \"kubernetes.io/projected/a583e679-8e90-4f82-b286-3eda40831c72-kube-api-access-x5lg6\") pod \"cluster-image-registry-operator-dc59b4c8b-f2wf8\" (UID: \"a583e679-8e90-4f82-b286-3eda40831c72\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.757211 4658 generic.go:334] "Generic (PLEG): container finished" podID="93963d75-dbb2-414c-9218-aee78bb8f819" containerID="dfb69f418e7a4839824929ea4a2da10b3449a94845dede94e56128a82349143d" exitCode=0 Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.757437 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" event={"ID":"93963d75-dbb2-414c-9218-aee78bb8f819","Type":"ContainerDied","Data":"dfb69f418e7a4839824929ea4a2da10b3449a94845dede94e56128a82349143d"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.757613 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" event={"ID":"93963d75-dbb2-414c-9218-aee78bb8f819","Type":"ContainerStarted","Data":"ceee70f29e2dca88b7d435893016b7c260790aba9482ff0373e6a4e2bc5ec37d"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.761097 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" event={"ID":"bfa1953c-4c82-4463-b772-6b871bcea9b8","Type":"ContainerStarted","Data":"360e5f3d6f36367ed53425d45674cc8e0f18cb931b58dd347a4599d061940c55"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.761157 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" event={"ID":"bfa1953c-4c82-4463-b772-6b871bcea9b8","Type":"ContainerStarted","Data":"8e340241aa5c23859cb4f2d96ef48f1db20f1fd0f991be9ff564c6d35a6e6465"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.771026 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4t6\" (UniqueName: \"kubernetes.io/projected/2670838d-90c6-490a-a620-676073872108-kube-api-access-pz4t6\") pod \"machine-approver-56656f9798-chn4z\" (UID: \"2670838d-90c6-490a-a620-676073872108\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.771586 4658 generic.go:334] "Generic (PLEG): container finished" podID="dd736d13-0140-458a-bbdf-bed6d2e55ce1" containerID="18270ec531cd4e881879484ff6b178634386ad90b2de19a5e324e8a2afec2fc7" exitCode=0 Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.771741 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" event={"ID":"dd736d13-0140-458a-bbdf-bed6d2e55ce1","Type":"ContainerDied","Data":"18270ec531cd4e881879484ff6b178634386ad90b2de19a5e324e8a2afec2fc7"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.771851 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" event={"ID":"dd736d13-0140-458a-bbdf-bed6d2e55ce1","Type":"ContainerStarted","Data":"4df0feb2f3d146cdca163026ed0661293d900ab816db2dcdd4a8d7ee1ff23e37"} Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.789625 4658 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.790881 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwrww\" (UniqueName: \"kubernetes.io/projected/ebab7917-b306-46a8-8dc7-f99b4b162c71-kube-api-access-jwrww\") pod \"openshift-apiserver-operator-796bbdcf4f-4mdfm\" (UID: \"ebab7917-b306-46a8-8dc7-f99b4b162c71\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.809731 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.812286 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.825765 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.839186 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.845886 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.870525 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.885749 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.890095 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.894268 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.909399 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.930880 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.942959 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvx8\" (UniqueName: \"kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943004 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943101 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-encryption-config\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943135 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943184 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943250 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943281 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943339 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.943474 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w7rrv"] Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944163 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944202 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944241 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944268 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjq4n\" (UniqueName: \"kubernetes.io/projected/e5b9eed3-8130-46f7-9418-caa829997f64-kube-api-access-vjq4n\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944367 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944391 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-dir\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944412 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-config\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944435 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-serving-cert\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944459 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsh8c\" (UniqueName: \"kubernetes.io/projected/02ca41da-9f6c-4432-9041-32e3aeba0e92-kube-api-access-fsh8c\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944512 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-serving-cert\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944580 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944607 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944665 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944696 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944721 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-proxy-tls\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944747 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944773 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qdv\" (UniqueName: \"kubernetes.io/projected/462e2be7-576b-4077-8e50-13b60aafa1bb-kube-api-access-46qdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944807 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944832 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944856 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh58m\" (UniqueName: \"kubernetes.io/projected/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-kube-api-access-rh58m\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944914 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.944983 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-client\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945010 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945033 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-config\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945061 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945088 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945109 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-policies\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945133 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtrk\" (UniqueName: \"kubernetes.io/projected/40342360-13dd-4953-805b-354528d0879d-kube-api-access-6jtrk\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945157 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945182 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462e2be7-576b-4077-8e50-13b60aafa1bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945207 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945234 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945266 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19e9649a-4386-4922-9cac-a57b34aa4d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945317 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945360 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phzj\" (UniqueName: \"kubernetes.io/projected/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-kube-api-access-8phzj\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945383 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w274r\" (UniqueName: \"kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945429 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-srv-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945486 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945563 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945589 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19e9649a-4386-4922-9cac-a57b34aa4d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945893 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:12 crc kubenswrapper[4658]: E1002 11:21:12.946263 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.446244158 +0000 UTC m=+154.337397825 (durationBeforeRetry 500ms). 
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.945920 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwf4\" (UniqueName: \"kubernetes.io/projected/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-kube-api-access-rcwf4\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.946706 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.946730 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-service-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.946824 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.946910 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.946936 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947345 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40342360-13dd-4953-805b-354528d0879d-metrics-tls\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947369 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e9649a-4386-4922-9cac-a57b34aa4d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947396 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947440 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5b9eed3-8130-46f7-9418-caa829997f64-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947531 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/462e2be7-576b-4077-8e50-13b60aafa1bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947557 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947580 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:21:12 crc kubenswrapper[4658]: I1002 11:21:12.947848 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldzb\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:12 crc kubenswrapper[4658]: W1002 11:21:12.954044 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc670b59a_b4ec_4332_9a76_72fee4666277.slice/crio-5c334cd464d77e553ccbcc23b493c23892d3bd26dc389d51157a6ae3ea646758 WatchSource:0}: Error finding container 5c334cd464d77e553ccbcc23b493c23892d3bd26dc389d51157a6ae3ea646758: Status 404 returned error can't find the container with id 5c334cd464d77e553ccbcc23b493c23892d3bd26dc389d51157a6ae3ea646758
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.021410 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z"
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.028291 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8"
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.043949 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-md7fr"]
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052374 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.052517 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.552489082 +0000 UTC m=+154.443642639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052572 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/462e2be7-576b-4077-8e50-13b60aafa1bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9"
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052600 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664jz\" (UniqueName: \"kubernetes.io/projected/faad5b47-a113-424c-bba9-a681a4107f98-kube-api-access-664jz\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4"
Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052620 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
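[Editor's note] Both CSI failures carry "No retries permitted until ... (durationBeforeRetry 500ms)": failed volume operations are requeued with a growing delay rather than retried immediately. An illustrative sketch of that retry pattern using apimachinery's wait package; the values are assumed for illustration, not kubelet's exact tuning:

// backoff_sketch.go: exponential backoff in the style of the
// nestedpendingoperations retry delays seen above.
package main

import (
	"errors"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	backoff := wait.Backoff{
		Duration: 500 * time.Millisecond, // initial delay, as in the log
		Factor:   2.0,                    // roughly double after each failure
		Steps:    5,                      // give up after five attempts
	}
	attempt := 0
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		attempt++
		fmt.Println("mount attempt", attempt)
		// Pretend the CSI driver registers before the third try.
		if attempt < 3 {
			return false, nil // not done yet; retry after the next delay
		}
		return true, nil
	})
	if errors.Is(err, wait.ErrWaitTimeout) {
		fmt.Println("driver never registered")
	}
}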
\"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-images\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052666 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052703 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-socket-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052719 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052737 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldzb\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052753 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvx8\" (UniqueName: \"kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052767 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052817 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-encryption-config\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052864 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.052894 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.053723 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.055104 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.055130 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87e683d-1f76-40b8-bfeb-b06076224893-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.055148 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/faad5b47-a113-424c-bba9-a681a4107f98-signing-cabundle\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.055165 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e526afb-63d8-4825-87c5-d039c0b81aeb-config-volume\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.055184 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.059186 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.060362 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.060396 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gd7h\" (UniqueName: \"kubernetes.io/projected/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-kube-api-access-4gd7h\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.060417 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjcw\" (UniqueName: \"kubernetes.io/projected/e3384716-8ea8-411a-a4e3-a50ec1cf6790-kube-api-access-qqjcw\") pod \"migrator-59844c95c7-j6d5m\" (UID: \"e3384716-8ea8-411a-a4e3-a50ec1cf6790\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.060434 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061676 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061739 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061763 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061796 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-serving-cert\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061821 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgll6\" (UniqueName: \"kubernetes.io/projected/45c25ebf-9993-4c4d-843b-5084afce8cfa-kube-api-access-dgll6\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.061848 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.062479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.063128 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.065135 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.065608 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/462e2be7-576b-4077-8e50-13b60aafa1bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.067154 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.067769 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.067835 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjq4n\" (UniqueName: \"kubernetes.io/projected/e5b9eed3-8130-46f7-9418-caa829997f64-kube-api-access-vjq4n\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.068158 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-metrics-certs\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.068392 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-webhook-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.068463 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069088 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-dir\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069145 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xpj\" (UniqueName: \"kubernetes.io/projected/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-kube-api-access-s8xpj\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069249 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-dir\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069460 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-config\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069627 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-serving-cert\") 
pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.069743 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpss\" (UniqueName: \"kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.072376 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073275 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073732 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87e683d-1f76-40b8-bfeb-b06076224893-config\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073772 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-profile-collector-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073845 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29djb\" (UniqueName: \"kubernetes.io/projected/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-kube-api-access-29djb\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073877 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-csi-data-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073915 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073966 4658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.073995 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsh8c\" (UniqueName: \"kubernetes.io/projected/02ca41da-9f6c-4432-9041-32e3aeba0e92-kube-api-access-fsh8c\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074018 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-cert\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074136 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074245 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-serving-cert\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074275 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074318 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-mountpoint-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074342 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.074416 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls\") pod 
\"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.075809 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-encryption-config\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.075950 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.076682 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qdv\" (UniqueName: \"kubernetes.io/projected/462e2be7-576b-4077-8e50-13b60aafa1bb-kube-api-access-46qdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.076773 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-proxy-tls\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.076813 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.076852 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-certs\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.077090 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.077710 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078131 
4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-registration-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078201 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e683d-1f76-40b8-bfeb-b06076224893-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078574 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078607 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh58m\" (UniqueName: \"kubernetes.io/projected/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-kube-api-access-rh58m\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078577 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-config\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078842 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078900 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-client\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.078928 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.079621 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-node-bootstrap-token\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.079680 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ns7\" (UniqueName: \"kubernetes.io/projected/6d37daf8-ad4f-4531-b2e7-04adeda4de89-kube-api-access-s4ns7\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.079690 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.079792 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-config\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.079836 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2af92ba0-9e7b-4a45-9513-70217f77a845-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.080511 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.080556 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/faad5b47-a113-424c-bba9-a681a4107f98-signing-key\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.080625 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.080915 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-config\") 
pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081031 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081192 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-policies\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081262 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-stats-auth\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081358 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081617 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtrk\" (UniqueName: \"kubernetes.io/projected/40342360-13dd-4953-805b-354528d0879d-kube-api-access-6jtrk\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081708 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081748 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fn4d\" (UniqueName: \"kubernetes.io/projected/2af92ba0-9e7b-4a45-9513-70217f77a845-kube-api-access-6fn4d\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081829 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5689bcf3-2722-4d26-8ee2-ebccfa61da08-tmpfs\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081876 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462e2be7-576b-4077-8e50-13b60aafa1bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081904 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081931 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-config\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081955 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mk4\" (UniqueName: \"kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.081977 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082055 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-serving-cert\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082186 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082233 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19e9649a-4386-4922-9cac-a57b34aa4d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082270 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-default-certificate\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082608 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082682 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0912be1c-00d6-47fb-84fa-58b6569ea434-service-ca-bundle\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082700 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082709 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-plugins-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082788 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9414d700-2392-4e49-b703-8bcf624bdf60-proxy-tls\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082850 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phzj\" (UniqueName: \"kubernetes.io/projected/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-kube-api-access-8phzj\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082899 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w274r\" (UniqueName: \"kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082967 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-srv-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083003 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwn95\" (UniqueName: \"kubernetes.io/projected/5689bcf3-2722-4d26-8ee2-ebccfa61da08-kube-api-access-gwn95\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.082968 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083114 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462e2be7-576b-4077-8e50-13b60aafa1bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083141 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083194 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083445 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083514 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19e9649a-4386-4922-9cac-a57b34aa4d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc 
kubenswrapper[4658]: I1002 11:21:13.083544 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083734 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.083844 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084104 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwf4\" (UniqueName: \"kubernetes.io/projected/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-kube-api-access-rcwf4\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084141 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.084437 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.584413537 +0000 UTC m=+154.475567104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084523 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084659 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084685 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-service-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084712 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-kube-api-access-mnbfv\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084824 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zb9z\" (UniqueName: \"kubernetes.io/projected/0912be1c-00d6-47fb-84fa-58b6569ea434-kube-api-access-2zb9z\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084846 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvsc\" (UniqueName: \"kubernetes.io/projected/8e526afb-63d8-4825-87c5-d039c0b81aeb-kube-api-access-zxvsc\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.084870 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hfzd\" (UniqueName: \"kubernetes.io/projected/9414d700-2392-4e49-b703-8bcf624bdf60-kube-api-access-5hfzd\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 
11:21:13.085588 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.085944 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-service-ca-bundle\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.086767 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.087020 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-serving-cert\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.087672 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088200 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088412 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v4m9t"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088456 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088716 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088768 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088808 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40342360-13dd-4953-805b-354528d0879d-metrics-tls\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088848 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e9649a-4386-4922-9cac-a57b34aa4d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088884 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.088954 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e526afb-63d8-4825-87c5-d039c0b81aeb-metrics-tls\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089015 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5b9eed3-8130-46f7-9418-caa829997f64-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089043 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-srv-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089477 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02ca41da-9f6c-4432-9041-32e3aeba0e92-audit-policies\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089523 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089756 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.089869 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.090462 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-proxy-tls\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.090647 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/02ca41da-9f6c-4432-9041-32e3aeba0e92-etcd-client\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.091099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.092356 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.094310 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e9649a-4386-4922-9cac-a57b34aa4d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.094805 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19e9649a-4386-4922-9cac-a57b34aa4d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: 
\"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.095692 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldzb\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.095813 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.096242 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-srv-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.096408 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.097621 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40342360-13dd-4953-805b-354528d0879d-metrics-tls\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.097792 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5b9eed3-8130-46f7-9418-caa829997f64-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.099408 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.104980 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvx8\" (UniqueName: \"kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8\") pod \"controller-manager-879f6c89f-cmjlm\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.154201 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjq4n\" (UniqueName: \"kubernetes.io/projected/e5b9eed3-8130-46f7-9418-caa829997f64-kube-api-access-vjq4n\") pod \"cluster-samples-operator-665b6dd947-6fx7w\" (UID: \"e5b9eed3-8130-46f7-9418-caa829997f64\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.177775 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsh8c\" (UniqueName: \"kubernetes.io/projected/02ca41da-9f6c-4432-9041-32e3aeba0e92-kube-api-access-fsh8c\") pod \"apiserver-7bbb656c7d-h6x29\" (UID: \"02ca41da-9f6c-4432-9041-32e3aeba0e92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.184436 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qdv\" (UniqueName: \"kubernetes.io/projected/462e2be7-576b-4077-8e50-13b60aafa1bb-kube-api-access-46qdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-66nv9\" (UID: \"462e2be7-576b-4077-8e50-13b60aafa1bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.189814 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.189965 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87e683d-1f76-40b8-bfeb-b06076224893-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.189986 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/faad5b47-a113-424c-bba9-a681a4107f98-signing-cabundle\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190003 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e526afb-63d8-4825-87c5-d039c0b81aeb-config-volume\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190026 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gd7h\" (UniqueName: \"kubernetes.io/projected/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-kube-api-access-4gd7h\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190041 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjcw\" (UniqueName: 
\"kubernetes.io/projected/e3384716-8ea8-411a-a4e3-a50ec1cf6790-kube-api-access-qqjcw\") pod \"migrator-59844c95c7-j6d5m\" (UID: \"e3384716-8ea8-411a-a4e3-a50ec1cf6790\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190061 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190081 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-serving-cert\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190099 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgll6\" (UniqueName: \"kubernetes.io/projected/45c25ebf-9993-4c4d-843b-5084afce8cfa-kube-api-access-dgll6\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190124 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-metrics-certs\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190139 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-webhook-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190156 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xpj\" (UniqueName: \"kubernetes.io/projected/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-kube-api-access-s8xpj\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190172 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpss\" (UniqueName: \"kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190188 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87e683d-1f76-40b8-bfeb-b06076224893-config\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190204 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-profile-collector-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190220 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29djb\" (UniqueName: \"kubernetes.io/projected/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-kube-api-access-29djb\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190236 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-csi-data-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190255 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190269 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190283 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-cert\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190314 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-mountpoint-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190331 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-certs\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190344 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-registration-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190367 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e683d-1f76-40b8-bfeb-b06076224893-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190405 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-node-bootstrap-token\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190420 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ns7\" (UniqueName: \"kubernetes.io/projected/6d37daf8-ad4f-4531-b2e7-04adeda4de89-kube-api-access-s4ns7\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190440 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2af92ba0-9e7b-4a45-9513-70217f77a845-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190458 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/faad5b47-a113-424c-bba9-a681a4107f98-signing-key\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190474 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-stats-auth\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190496 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fn4d\" (UniqueName: \"kubernetes.io/projected/2af92ba0-9e7b-4a45-9513-70217f77a845-kube-api-access-6fn4d\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190516 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5689bcf3-2722-4d26-8ee2-ebccfa61da08-tmpfs\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: 
\"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190535 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-config\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190553 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62mk4\" (UniqueName: \"kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190576 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190602 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-default-certificate\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190618 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0912be1c-00d6-47fb-84fa-58b6569ea434-service-ca-bundle\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190633 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-plugins-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190652 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9414d700-2392-4e49-b703-8bcf624bdf60-proxy-tls\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190683 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwn95\" (UniqueName: \"kubernetes.io/projected/5689bcf3-2722-4d26-8ee2-ebccfa61da08-kube-api-access-gwn95\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190722 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190762 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190786 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-kube-api-access-mnbfv\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190802 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zb9z\" (UniqueName: \"kubernetes.io/projected/0912be1c-00d6-47fb-84fa-58b6569ea434-kube-api-access-2zb9z\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190823 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvsc\" (UniqueName: \"kubernetes.io/projected/8e526afb-63d8-4825-87c5-d039c0b81aeb-kube-api-access-zxvsc\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190839 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hfzd\" (UniqueName: \"kubernetes.io/projected/9414d700-2392-4e49-b703-8bcf624bdf60-kube-api-access-5hfzd\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190860 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e526afb-63d8-4825-87c5-d039c0b81aeb-metrics-tls\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190875 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-srv-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190890 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664jz\" (UniqueName: \"kubernetes.io/projected/faad5b47-a113-424c-bba9-a681a4107f98-kube-api-access-664jz\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: 
\"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190907 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-images\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190922 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-socket-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.190936 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.193897 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-socket-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.197099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-plugins-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.197180 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5689bcf3-2722-4d26-8ee2-ebccfa61da08-tmpfs\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.198235 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-config\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.198409 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0912be1c-00d6-47fb-84fa-58b6569ea434-service-ca-bundle\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.198620 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-02 11:21:13.698413698 +0000 UTC m=+154.589567345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.198723 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-images\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.199379 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e526afb-63d8-4825-87c5-d039c0b81aeb-config-volume\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.199545 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/faad5b47-a113-424c-bba9-a681a4107f98-signing-cabundle\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.201412 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.203540 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-default-certificate\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.203717 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-csi-data-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.203815 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-mountpoint-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.203915 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.204069 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45c25ebf-9993-4c4d-843b-5084afce8cfa-registration-dir\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.204341 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9414d700-2392-4e49-b703-8bcf624bdf60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.204975 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87e683d-1f76-40b8-bfeb-b06076224893-config\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.205455 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.208182 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-profile-collector-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.208345 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.209166 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9414d700-2392-4e49-b703-8bcf624bdf60-proxy-tls\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.209450 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.210019 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/faad5b47-a113-424c-bba9-a681a4107f98-signing-key\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.210203 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-node-bootstrap-token\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.210241 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2af92ba0-9e7b-4a45-9513-70217f77a845-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.211061 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-webhook-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.213398 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-cert\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.213799 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e683d-1f76-40b8-bfeb-b06076224893-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.213863 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5689bcf3-2722-4d26-8ee2-ebccfa61da08-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.214626 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e526afb-63d8-4825-87c5-d039c0b81aeb-metrics-tls\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.214744 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-serving-cert\") pod 
\"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.214885 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-certs\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.215721 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d37daf8-ad4f-4531-b2e7-04adeda4de89-srv-cert\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.217926 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-stats-auth\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.222378 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0912be1c-00d6-47fb-84fa-58b6569ea434-metrics-certs\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.228578 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b54e3fe-e025-45e2-bbf2-43f6ccadc773-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qb9w\" (UID: \"8b54e3fe-e025-45e2-bbf2-43f6ccadc773\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.229489 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh58m\" (UniqueName: \"kubernetes.io/projected/5a599a59-e905-4c9a-9f4b-e4a11dce9ba4-kube-api-access-rh58m\") pod \"authentication-operator-69f744f599-lhc27\" (UID: \"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.250149 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtrk\" (UniqueName: \"kubernetes.io/projected/40342360-13dd-4953-805b-354528d0879d-kube-api-access-6jtrk\") pod \"dns-operator-744455d44c-5lrcf\" (UID: \"40342360-13dd-4953-805b-354528d0879d\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.254275 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.269529 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19e9649a-4386-4922-9cac-a57b34aa4d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vn5bf\" (UID: \"19e9649a-4386-4922-9cac-a57b34aa4d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.272998 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.287779 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.292318 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.292977 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.79295418 +0000 UTC m=+154.684107747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.308225 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w274r\" (UniqueName: \"kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r\") pod \"route-controller-manager-6576b87f9c-27792\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.316085 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.317501 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.326730 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phzj\" (UniqueName: \"kubernetes.io/projected/4c784dcc-2a24-462a-aaf8-7c3cf4d2d588-kube-api-access-8phzj\") pod \"machine-config-controller-84d6567774-2ltpz\" (UID: \"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.346216 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwf4\" (UniqueName: \"kubernetes.io/projected/58ab4a40-4a69-4505-b60e-32b8b62eaeb5-kube-api-access-rcwf4\") pod \"olm-operator-6b444d44fb-b856h\" (UID: \"58ab4a40-4a69-4505-b60e-32b8b62eaeb5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.364921 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.370038 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.372284 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv\") pod \"oauth-openshift-558db77b4-wvclq\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.376474 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.381534 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-47rr5"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.393802 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.393865 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz"] Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.394074 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.894049318 +0000 UTC m=+154.785202885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.394376 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.394805 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.894798394 +0000 UTC m=+154.785951961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.397142 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.404915 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.409331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ns7\" (UniqueName: \"kubernetes.io/projected/6d37daf8-ad4f-4531-b2e7-04adeda4de89-kube-api-access-s4ns7\") pod \"catalog-operator-68c6474976-qlp92\" (UID: \"6d37daf8-ad4f-4531-b2e7-04adeda4de89\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: W1002 11:21:13.419390 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17feaa75_00bd_4b47_a857_5fa5b27427fb.slice/crio-b35a94eb8cc23f4ca155b9f917f0743040ce062ddb17415c906091aa13c9aa93 WatchSource:0}: Error finding container b35a94eb8cc23f4ca155b9f917f0743040ce062ddb17415c906091aa13c9aa93: Status 404 returned error can't find the container with id b35a94eb8cc23f4ca155b9f917f0743040ce062ddb17415c906091aa13c9aa93 Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.420985 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.423065 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwn95\" (UniqueName: \"kubernetes.io/projected/5689bcf3-2722-4d26-8ee2-ebccfa61da08-kube-api-access-gwn95\") pod \"packageserver-d55dfcdfc-rrdvc\" (UID: \"5689bcf3-2722-4d26-8ee2-ebccfa61da08\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: W1002 11:21:13.430669 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f36aa73_2c64_431b_8991_37312e054756.slice/crio-41f4bc0c43bb0180aaaf9936f6ca7ae695989a8d3d589389390091d0b6e94bfb WatchSource:0}: Error finding container 41f4bc0c43bb0180aaaf9936f6ca7ae695989a8d3d589389390091d0b6e94bfb: Status 404 returned error can't find the container with id 41f4bc0c43bb0180aaaf9936f6ca7ae695989a8d3d589389390091d0b6e94bfb Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.446586 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mk4\" (UniqueName: \"kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4\") pod \"collect-profiles-29323395-h867z\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.453679 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.459234 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.465391 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.472244 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.477476 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664jz\" (UniqueName: \"kubernetes.io/projected/faad5b47-a113-424c-bba9-a681a4107f98-kube-api-access-664jz\") pod \"service-ca-9c57cc56f-m5pf4\" (UID: \"faad5b47-a113-424c-bba9-a681a4107f98\") " pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.481352 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.490050 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/01ef8f5c-e4a0-41be-ac66-47ded9a4fc52-kube-api-access-mnbfv\") pod \"service-ca-operator-777779d784-4lz8p\" (UID: \"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.496109 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.496577 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:13.996556053 +0000 UTC m=+154.887709620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.509346 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.510083 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hfzd\" (UniqueName: \"kubernetes.io/projected/9414d700-2392-4e49-b703-8bcf624bdf60-kube-api-access-5hfzd\") pod \"machine-config-operator-74547568cd-fj9hh\" (UID: \"9414d700-2392-4e49-b703-8bcf624bdf60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.515951 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.517718 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.533154 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zb9z\" (UniqueName: \"kubernetes.io/projected/0912be1c-00d6-47fb-84fa-58b6569ea434-kube-api-access-2zb9z\") pod \"router-default-5444994796-g4bk2\" (UID: \"0912be1c-00d6-47fb-84fa-58b6569ea434\") " pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.547825 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.550495 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fn4d\" (UniqueName: \"kubernetes.io/projected/2af92ba0-9e7b-4a45-9513-70217f77a845-kube-api-access-6fn4d\") pod \"package-server-manager-789f6589d5-g6jkm\" (UID: \"2af92ba0-9e7b-4a45-9513-70217f77a845\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.562554 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.562736 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.566055 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvsc\" (UniqueName: \"kubernetes.io/projected/8e526afb-63d8-4825-87c5-d039c0b81aeb-kube-api-access-zxvsc\") pod \"dns-default-l6vlk\" (UID: \"8e526afb-63d8-4825-87c5-d039c0b81aeb\") " pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.576809 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.592274 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.592494 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.599143 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.599649 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.099626669 +0000 UTC m=+154.990780236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.615053 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xpj\" (UniqueName: \"kubernetes.io/projected/02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c-kube-api-access-s8xpj\") pod \"multus-admission-controller-857f4d67dd-tgmnk\" (UID: \"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.622216 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.626310 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjcw\" (UniqueName: \"kubernetes.io/projected/e3384716-8ea8-411a-a4e3-a50ec1cf6790-kube-api-access-qqjcw\") pod \"migrator-59844c95c7-j6d5m\" (UID: \"e3384716-8ea8-411a-a4e3-a50ec1cf6790\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.644132 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87e683d-1f76-40b8-bfeb-b06076224893-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d5zmz\" (UID: \"a87e683d-1f76-40b8-bfeb-b06076224893\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.647736 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgll6\" (UniqueName: \"kubernetes.io/projected/45c25ebf-9993-4c4d-843b-5084afce8cfa-kube-api-access-dgll6\") pod \"csi-hostpathplugin-kbq7v\" (UID: \"45c25ebf-9993-4c4d-843b-5084afce8cfa\") " pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: W1002 11:21:13.661417 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f101b6_dee6_41af_8943_cd8ebfd1d528.slice/crio-7ff2337159a796d9c427a9407ff46fc6f0b2618b36507fbc2846de3cd6163c4e WatchSource:0}: Error finding container 7ff2337159a796d9c427a9407ff46fc6f0b2618b36507fbc2846de3cd6163c4e: Status 404 returned error can't find the container with id 7ff2337159a796d9c427a9407ff46fc6f0b2618b36507fbc2846de3cd6163c4e Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.664277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gd7h\" (UniqueName: \"kubernetes.io/projected/78ef8aa8-69b1-43ae-844a-9b3bed415a4a-kube-api-access-4gd7h\") pod \"ingress-canary-8xwb5\" (UID: \"78ef8aa8-69b1-43ae-844a-9b3bed415a4a\") " pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.701145 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
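Each E-level nestedpendingoperations.go:348 line above records the same backoff decision: the failed volume operation is stamped with a deadline (always failure time plus 500ms here, per durationBeforeRetry), and the reconciler may not retry it earlier. The m=+154.99... suffix is Go's monotonic-clock reading appended by time.Time's String method, i.e. seconds since the kubelet process started. A rough sketch of that gating with a fixed delay (an assumption made for brevity; the real kubelet can grow the delay exponentially up to a cap):

```go
package main

import (
	"fmt"
	"time"
)

// retryGate sketches the "No retries permitted until ..." behaviour: after
// a failure, the operation records the earliest time it may run again.
// The 500ms matches durationBeforeRetry in the log above.
type retryGate struct {
	delay     time.Duration
	notBefore time.Time
}

func (g *retryGate) recordFailure(now time.Time) time.Time {
	g.notBefore = now.Add(g.delay)
	return g.notBefore
}

func (g *retryGate) mayRetry(now time.Time) bool {
	return !now.Before(g.notBefore)
}

func main() {
	g := &retryGate{delay: 500 * time.Millisecond}
	now := time.Now()
	deadline := g.recordFailure(now)
	fmt.Printf("No retries permitted until %s (durationBeforeRetry %s)\n",
		deadline.Format(time.RFC3339Nano), g.delay)
	fmt.Println(g.mayRetry(now))                             // false: still inside the backoff window
	fmt.Println(g.mayRetry(now.Add(501 * time.Millisecond))) // true: deadline has passed
}
```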
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.701416 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.201378989 +0000 UTC m=+155.092532596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.701507 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.701696 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.702094 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.202082933 +0000 UTC m=+155.093236500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.704183 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29djb\" (UniqueName: \"kubernetes.io/projected/89e4a0f1-ceb6-41f9-ab80-e397d4962f59-kube-api-access-29djb\") pod \"machine-config-server-hmn5s\" (UID: \"89e4a0f1-ceb6-41f9-ab80-e397d4962f59\") " pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.733264 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpss\" (UniqueName: \"kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss\") pod \"marketplace-operator-79b997595-q5bt5\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: W1002 11:21:13.747832 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ca41da_9f6c_4432_9041_32e3aeba0e92.slice/crio-9cbd9dfc5bed2358288d4187c0bb86abb4e9e72aa06e59230494c0a83b46d1f7 WatchSource:0}: Error finding container 9cbd9dfc5bed2358288d4187c0bb86abb4e9e72aa06e59230494c0a83b46d1f7: Status 404 returned error can't find the container with id 9cbd9dfc5bed2358288d4187c0bb86abb4e9e72aa06e59230494c0a83b46d1f7 Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.790360 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.801108 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.804475 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.804585 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" event={"ID":"34322326-016a-4e58-b14c-680c8cc94dbb","Type":"ContainerStarted","Data":"07f2beb960eb56e8a13e6cbc9eae22fe09947d26783b22ebfc9dd69b48519abd"} Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.804731 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.304701133 +0000 UTC m=+155.195854690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.804794 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.805144 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.305134668 +0000 UTC m=+155.196288235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.829655 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.831476 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.835551 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.845308 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.864456 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w7rrv" event={"ID":"c670b59a-b4ec-4332-9a76-72fee4666277","Type":"ContainerStarted","Data":"d7ce7b0063e1bab847f83b8fb980683490f60d503799accc48a00109aa195936"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.864514 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w7rrv" event={"ID":"c670b59a-b4ec-4332-9a76-72fee4666277","Type":"ContainerStarted","Data":"5c334cd464d77e553ccbcc23b493c23892d3bd26dc389d51157a6ae3ea646758"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.865011 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.867253 4658 patch_prober.go:28] interesting pod/downloads-7954f5f757-w7rrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.867326 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w7rrv" podUID="c670b59a-b4ec-4332-9a76-72fee4666277" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.871217 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" event={"ID":"4f36aa73-2c64-431b-8991-37312e054756","Type":"ContainerStarted","Data":"41f4bc0c43bb0180aaaf9936f6ca7ae695989a8d3d589389390091d0b6e94bfb"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.876429 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.893389 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-md7fr" event={"ID":"4082750e-cf12-45b4-8920-63f31ad1cc28","Type":"ContainerStarted","Data":"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.893699 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-md7fr" event={"ID":"4082750e-cf12-45b4-8920-63f31ad1cc28","Type":"ContainerStarted","Data":"3021f14394484fd84cd612e756649c0d9db6ae2680dcc8adb20be503382eedbb"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.899433 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmn5s" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.904836 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" event={"ID":"93963d75-dbb2-414c-9218-aee78bb8f819","Type":"ContainerStarted","Data":"a4953ab0cb63dd0c042b53c90b4165627454000322c48a89ff6c021596574b02"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.904900 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" event={"ID":"93963d75-dbb2-414c-9218-aee78bb8f819","Type":"ContainerStarted","Data":"3b5aeca486608cb3492772539f3f6010d7aa09abe8dd244b39d09ac860a3607c"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.905399 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:13 crc kubenswrapper[4658]: E1002 11:21:13.907623 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.407591472 +0000 UTC m=+155.298745039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.907772 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8xwb5" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.923177 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" event={"ID":"ebab7917-b306-46a8-8dc7-f99b4b162c71","Type":"ContainerStarted","Data":"7d8aaa14c0fc691a28803e35b47b92366117486659246c21df481958cc59390e"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.927971 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.947472 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" event={"ID":"17feaa75-00bd-4b47-a857-5fa5b27427fb","Type":"ContainerStarted","Data":"b35a94eb8cc23f4ca155b9f917f0743040ce062ddb17415c906091aa13c9aa93"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.966040 4658 patch_prober.go:28] interesting pod/console-operator-58897d9998-v4m9t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.966259 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" podUID="0b1493cb-71c1-4283-9c81-b73014189a60" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980039 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" event={"ID":"60f101b6-dee6-41af-8943-cd8ebfd1d528","Type":"ContainerStarted","Data":"7ff2337159a796d9c427a9407ff46fc6f0b2618b36507fbc2846de3cd6163c4e"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980093 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" event={"ID":"2b161a36-8654-4948-8412-bb68940fe512","Type":"ContainerStarted","Data":"06492d16c89b1ab3f3c6d26f9d551e6c761b33623be006bbcf9d2c27ab206054"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980116 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980130 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" event={"ID":"2b161a36-8654-4948-8412-bb68940fe512","Type":"ContainerStarted","Data":"7f9e01717e90b8214e4ccb2c00e753b6d9805ded923b1f3c07d4a6617ab526b3"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980141 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" event={"ID":"0b1493cb-71c1-4283-9c81-b73014189a60","Type":"ContainerStarted","Data":"1944e0d741ca8d5f25b23958d5d15f1ff3a654fd1fee837a0334d20f4383cecf"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980151 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" event={"ID":"0b1493cb-71c1-4283-9c81-b73014189a60","Type":"ContainerStarted","Data":"5b34624e299e8b26cc8a07558fb4f4bb0e16fac0a776c9c98bf792b7d7c9ae00"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980161 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" event={"ID":"2670838d-90c6-490a-a620-676073872108","Type":"ContainerStarted","Data":"64ba32c362961ac7b4d3bc447550448feb10f1b1f6e796475d77db83b6892ada"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.980174 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" event={"ID":"2670838d-90c6-490a-a620-676073872108","Type":"ContainerStarted","Data":"c50dd07b0e2f020814c18565c8e5c708b9e3fed533c707e2b4645d52675830c9"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.988661 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" event={"ID":"dd736d13-0140-458a-bbdf-bed6d2e55ce1","Type":"ContainerStarted","Data":"c9165a1cb318e783597f07e9c271858102accde1748791d9d61382817e90ba2b"} Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.988701 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" Oct 02 11:21:13 crc kubenswrapper[4658]: I1002 11:21:13.995768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" event={"ID":"02ca41da-9f6c-4432-9041-32e3aeba0e92","Type":"ContainerStarted","Data":"9cbd9dfc5bed2358288d4187c0bb86abb4e9e72aa06e59230494c0a83b46d1f7"} Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.012849 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.013787 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.513772315 +0000 UTC m=+155.404925882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.048468 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.065736 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" event={"ID":"a583e679-8e90-4f82-b286-3eda40831c72","Type":"ContainerStarted","Data":"36e6e4640970f3694ba07e5c985a2934a63d7a1016c339b9a9931d4e5322ae4a"} Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.065789 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" event={"ID":"a583e679-8e90-4f82-b286-3eda40831c72","Type":"ContainerStarted","Data":"d97647eb7bbda9f1c55c5dda2a4da6f11f44b3e9b1f7f0edb3765b9cfb1a719d"} Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.113958 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.114778 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.614758397 +0000 UTC m=+155.505911964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.215207 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.215865 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.715847685 +0000 UTC m=+155.607001252 (durationBeforeRetry 500ms). 
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.219521 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lrcf"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.254487 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lhc27"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.323358 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.325667 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.825641431 +0000 UTC m=+155.716794998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.326516 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.333686 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.352527 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.428395 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.428706 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:14.928691475 +0000 UTC m=+155.819845042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.465507 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-w7rrv" podStartSLOduration=133.465484157 podStartE2EDuration="2m13.465484157s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:14.462692972 +0000 UTC m=+155.353846569" watchObservedRunningTime="2025-10-02 11:21:14.465484157 +0000 UTC m=+155.356637724"
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.473220 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m5pf4"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.529578 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.529950 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.029931638 +0000 UTC m=+155.921085205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.574758 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w"]
Oct 02 11:21:14 crc kubenswrapper[4658]: W1002 11:21:14.595098 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaad5b47_a113_424c_bba9_a681a4107f98.slice/crio-b7037726bf9d967b3fbcb11637d673f42be6184eb6c2b32d42488730b37ad99d WatchSource:0}: Error finding container b7037726bf9d967b3fbcb11637d673f42be6184eb6c2b32d42488730b37ad99d: Status 404 returned error can't find the container with id b7037726bf9d967b3fbcb11637d673f42be6184eb6c2b32d42488730b37ad99d
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.631238 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.631644 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.131625896 +0000 UTC m=+156.022779463 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.631647 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" podStartSLOduration=133.631622486 podStartE2EDuration="2m13.631622486s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:14.594355638 +0000 UTC m=+155.485509205" watchObservedRunningTime="2025-10-02 11:21:14.631622486 +0000 UTC m=+155.522776063"
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.632022 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l6vlk"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.634826 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.635107 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f2wf8" podStartSLOduration=133.635097546 podStartE2EDuration="2m13.635097546s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:14.631891336 +0000 UTC m=+155.523044913" watchObservedRunningTime="2025-10-02 11:21:14.635097546 +0000 UTC m=+155.526251113"
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.723067 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-v4m9t" podStartSLOduration=133.723046842 podStartE2EDuration="2m13.723046842s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:14.664761243 +0000 UTC m=+155.555914810" watchObservedRunningTime="2025-10-02 11:21:14.723046842 +0000 UTC m=+155.614200409"
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.731942 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.732318 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.232275719 +0000 UTC m=+156.123429286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
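The pod_startup_latency_tracker.go:104 lines above are internally consistent arithmetic: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and the zeroed firstStartedPulling/lastFinishedPulling values indicate no image pull contributed to the measured startup. Rechecking the downloads-7954f5f757-w7rrv numbers from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the "Observed pod startup duration" entry above.
	created, _ := time.Parse(time.RFC3339, "2025-10-02T11:19:01Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-10-02T11:21:14.465484157Z")
	d := observed.Sub(created)
	// Prints: 2m13.465484157s = 133.465484157 s, matching both
	// podStartE2EDuration and podStartSLOduration in the log.
	fmt.Printf("%v = %.9f s\n", d, d.Seconds())
}
```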
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.740979 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4wktl" podStartSLOduration=132.740960037 podStartE2EDuration="2m12.740960037s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:14.740676417 +0000 UTC m=+155.631829984" watchObservedRunningTime="2025-10-02 11:21:14.740960037 +0000 UTC m=+155.632113604"
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.761880 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.785412 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.799331 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.853051 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.873881 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.874281 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.374261128 +0000 UTC m=+156.265414695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:14 crc kubenswrapper[4658]: W1002 11:21:14.876038 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c784dcc_2a24_462a_aaf8_7c3cf4d2d588.slice/crio-f7d7890b097baac61abd92b5a44b5799484ca3a8e1d0732f04fdad62dc3ea88d WatchSource:0}: Error finding container f7d7890b097baac61abd92b5a44b5799484ca3a8e1d0732f04fdad62dc3ea88d: Status 404 returned error can't find the container with id f7d7890b097baac61abd92b5a44b5799484ca3a8e1d0732f04fdad62dc3ea88d
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.950765 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.971550 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz"]
Oct 02 11:21:14 crc kubenswrapper[4658]: I1002 11:21:14.980014 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:14 crc kubenswrapper[4658]: E1002 11:21:14.980335 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.480306136 +0000 UTC m=+156.371459703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
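The W-level manager.go:1169 warnings scattered through this section appear to be a benign startup race in the kubelet's embedded cAdvisor: a cgroup watch event for a just-created crio-... container arrives before that container is visible in the runtime's state, the lookup returns 404, and the event is dropped; the same container IDs show up moments later in ContainerStarted events. A schematic of the race, with a plain map standing in for the runtime's view (an assumption, not the actual cAdvisor code):

```go
package main

import "fmt"

// known stands in for the set of containers cAdvisor can currently see.
var known = map[string]bool{}

func handleWatchEvent(id string) error {
	if !known[id] {
		// corresponds to: "Status 404 returned error can't find the container with id ..."
		return fmt.Errorf("can't find the container with id %s", id)
	}
	return nil
}

func main() {
	id := "f7d7890b097baac61abd92b5a44b5799484ca3a8e1d0732f04fdad62dc3ea88d"
	if err := handleWatchEvent(id); err != nil {
		// the cgroup exists before the container is listed by the runtime
		fmt.Println("Failed to process watch event:", err)
	}
	known[id] = true                  // moments later the container becomes visible
	fmt.Println(handleWatchEvent(id)) // <nil>: subsequent events process normally
}
```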
Oct 02 11:21:15 crc kubenswrapper[4658]: W1002 11:21:15.027075 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9414d700_2392_4e49_b703_8bcf624bdf60.slice/crio-1f3b5b0fae74cb0117ad40d79b1b1423f3b5585c8b08e8e90e866f364dbc3299 WatchSource:0}: Error finding container 1f3b5b0fae74cb0117ad40d79b1b1423f3b5585c8b08e8e90e866f364dbc3299: Status 404 returned error can't find the container with id 1f3b5b0fae74cb0117ad40d79b1b1423f3b5585c8b08e8e90e866f364dbc3299
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.045550 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm"]
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.071173 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-md7fr" podStartSLOduration=134.071115491 podStartE2EDuration="2m14.071115491s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.070453168 +0000 UTC m=+155.961606755" watchObservedRunningTime="2025-10-02 11:21:15.071115491 +0000 UTC m=+155.962269078"
Oct 02 11:21:15 crc kubenswrapper[4658]: W1002 11:21:15.078672 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af92ba0_9e7b_4a45_9513_70217f77a845.slice/crio-4d012f8634b16c182c0e19f9e796f8ba7f143eec420ce3f0e8a8867172038d62 WatchSource:0}: Error finding container 4d012f8634b16c182c0e19f9e796f8ba7f143eec420ce3f0e8a8867172038d62: Status 404 returned error can't find the container with id 4d012f8634b16c182c0e19f9e796f8ba7f143eec420ce3f0e8a8867172038d62
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.084637 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.085267 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.585250186 +0000 UTC m=+156.476403753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.085598 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgmnk"]
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.104361 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8xwb5"]
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.120525 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kbq7v"]
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.126832 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"]
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.146578 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" event={"ID":"19e9649a-4386-4922-9cac-a57b34aa4d2e","Type":"ContainerStarted","Data":"88f2bdfbbe4bd41daf4e86f722473ea1bcbaac02c9c0e53822f00257e21b72ba"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.149332 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" event={"ID":"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588","Type":"ContainerStarted","Data":"f7d7890b097baac61abd92b5a44b5799484ca3a8e1d0732f04fdad62dc3ea88d"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.153441 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" event={"ID":"9414d700-2392-4e49-b703-8bcf624bdf60","Type":"ContainerStarted","Data":"1f3b5b0fae74cb0117ad40d79b1b1423f3b5585c8b08e8e90e866f364dbc3299"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.162537 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" event={"ID":"a87e683d-1f76-40b8-bfeb-b06076224893","Type":"ContainerStarted","Data":"a5dbef4cde86bb296a7de6ee29965fbb13bb541228a7d7f21b294c829bbae08e"}
Oct 02 11:21:15 crc kubenswrapper[4658]: W1002 11:21:15.166013 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ef8aa8_69b1_43ae_844a_9b3bed415a4a.slice/crio-660d5408729bce344937849fbd6b92a79756ac40474ff20d7473da33d974d8b9 WatchSource:0}: Error finding container 660d5408729bce344937849fbd6b92a79756ac40474ff20d7473da33d974d8b9: Status 404 returned error can't find the container with id 660d5408729bce344937849fbd6b92a79756ac40474ff20d7473da33d974d8b9
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.173451 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" event={"ID":"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4","Type":"ContainerStarted","Data":"4298805e2d04c08d1cc512c15dc8c8a8528bd7db32b76692bfbb2609f3fbf46b"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.173514 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" event={"ID":"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4","Type":"ContainerStarted","Data":"326b5d7ccaa99fc15385b1b86750b1aec5ac2d3b6ff19265f6a888e8d4b1057d"}
4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" event={"ID":"5a599a59-e905-4c9a-9f4b-e4a11dce9ba4","Type":"ContainerStarted","Data":"326b5d7ccaa99fc15385b1b86750b1aec5ac2d3b6ff19265f6a888e8d4b1057d"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.187163 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.187744 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.68771792 +0000 UTC m=+156.578871497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.198467 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" event={"ID":"4f36aa73-2c64-431b-8991-37312e054756","Type":"ContainerStarted","Data":"3fd6c03ab5d0d167b3e54b0589fefe3bdb504198768d03c8ce153c0052da11c1"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.198525 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" event={"ID":"4f36aa73-2c64-431b-8991-37312e054756","Type":"ContainerStarted","Data":"f7df2dcdf52cff619257827ece93047cb39667dd33adb525964560490fd381c1"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.244759 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gjt96" podStartSLOduration=133.244736306 podStartE2EDuration="2m13.244736306s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.240817961 +0000 UTC m=+156.131971548" watchObservedRunningTime="2025-10-02 11:21:15.244736306 +0000 UTC m=+156.135889873" Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.258076 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m"] Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.258459 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" event={"ID":"e5b9eed3-8130-46f7-9418-caa829997f64","Type":"ContainerStarted","Data":"4794f7fa2e1df1e97d51535e5bbddf937ca39cab965ff60a5cee23c156207ad3"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.258491 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" 
event={"ID":"e5b9eed3-8130-46f7-9418-caa829997f64","Type":"ContainerStarted","Data":"18ada0dd076a5e5b87b2fd04f6b37ca421ed9d2a975725c8f9c2b232fb0a89db"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.266722 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" event={"ID":"60f101b6-dee6-41af-8943-cd8ebfd1d528","Type":"ContainerStarted","Data":"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.268135 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.290682 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" event={"ID":"8b54e3fe-e025-45e2-bbf2-43f6ccadc773","Type":"ContainerStarted","Data":"36efef250976ef310a167e26a4aa9fdb91db8131a8ebcf1862b3614192fa9df6"} Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.292827 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.318719 4658 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cmjlm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.318786 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.326753 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmn5s" event={"ID":"89e4a0f1-ceb6-41f9-ab80-e397d4962f59","Type":"ContainerStarted","Data":"c7ffa8ba0193794b400515dab4b59f57dca7f0cd990346f8d380a3fb36a1baab"} Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.342427 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.842392815 +0000 UTC m=+156.733546392 (durationBeforeRetry 500ms). 
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.342687 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l6vlk" event={"ID":"8e526afb-63d8-4825-87c5-d039c0b81aeb","Type":"ContainerStarted","Data":"ce86df106c32fe8ff5eb1d0fa14a53530f85b4474134303bf7513c8fcdaddd9f"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.345666 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" event={"ID":"c7d11297-76f5-4bdd-a744-57ad6376de77","Type":"ContainerStarted","Data":"4f76ac8b12a37a64db991f736345aa34b1bebe5591d13173c4b52c1ee5047dc9"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.365463 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" event={"ID":"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52","Type":"ContainerStarted","Data":"c222b60c765cdd9985106899f6b9354ffd6eb518f53cf6d1494b9abca1d2c09d"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.370117 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v" podStartSLOduration=134.370088515 podStartE2EDuration="2m14.370088515s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.319464799 +0000 UTC m=+156.210618376" watchObservedRunningTime="2025-10-02 11:21:15.370088515 +0000 UTC m=+156.261242082"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.389600 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" event={"ID":"fecb5f70-edd2-466b-a31f-25b1db79aec5","Type":"ContainerStarted","Data":"b1dfeda52893d1d544ddc89cd1af4f0984c3fe3c578f0ae7bbe564efe7e26f33"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.389667 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" event={"ID":"fecb5f70-edd2-466b-a31f-25b1db79aec5","Type":"ContainerStarted","Data":"e11c8d2d98441577cdc5203468750ca9a69678ba385d41969073d8f00cfcd74e"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.392859 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.393672 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.393933 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.893898812 +0000 UTC m=+156.785052409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.394158 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.395699 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.895681603 +0000 UTC m=+156.786835240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.406552 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" event={"ID":"40342360-13dd-4953-805b-354528d0879d","Type":"ContainerStarted","Data":"8723451012ecfbf4e58f1e5b05998b38cda31f81bb84f7c943c29ba105468029"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.418246 4658 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wvclq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.418647 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.431606 4658 generic.go:334] "Generic (PLEG): container finished" podID="02ca41da-9f6c-4432-9041-32e3aeba0e92" containerID="ae5e1d399cecd40b1c1367f177ff99428793cd4724065cc439a58c231c575ec3" exitCode=0
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.432086 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" event={"ID":"02ca41da-9f6c-4432-9041-32e3aeba0e92","Type":"ContainerDied","Data":"ae5e1d399cecd40b1c1367f177ff99428793cd4724065cc439a58c231c575ec3"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.469950 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" event={"ID":"58ab4a40-4a69-4505-b60e-32b8b62eaeb5","Type":"ContainerStarted","Data":"f4f653961e1523f2bff4fc897f4c9e293821553cf1075e8c6d979f0a867d8670"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.470911 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.471985 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" podStartSLOduration=134.4719694 podStartE2EDuration="2m14.4719694s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.471234565 +0000 UTC m=+156.362388142" watchObservedRunningTime="2025-10-02 11:21:15.4719694 +0000 UTC m=+156.363122967"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.479496 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" event={"ID":"6d37daf8-ad4f-4531-b2e7-04adeda4de89","Type":"ContainerStarted","Data":"c57d5c4a5d89cc862a05e2dfe17d084d0ae35c09dd4f006c6dddd7809c4b08bd"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.480421 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.484249 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" event={"ID":"5689bcf3-2722-4d26-8ee2-ebccfa61da08","Type":"ContainerStarted","Data":"234721f9b808155dfd1cf8f83dc34234b30638e52af44b59cbc867a0cf52a70e"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.486850 4658 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qlp92 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.486905 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" podUID="6d37daf8-ad4f-4531-b2e7-04adeda4de89" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.489200 4658 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-b856h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.489234 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" podUID="58ab4a40-4a69-4505-b60e-32b8b62eaeb5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.495740 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.496135 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.996101437 +0000 UTC m=+156.887255004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.496894 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.497395 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:15.997375532 +0000 UTC m=+156.888529099 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.511185 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g4bk2" event={"ID":"0912be1c-00d6-47fb-84fa-58b6569ea434","Type":"ContainerStarted","Data":"ecd1420aa79400f9556d17e4438c8db1d4b3511a064231b7e8e12c1f331759a9"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.511264 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g4bk2" event={"ID":"0912be1c-00d6-47fb-84fa-58b6569ea434","Type":"ContainerStarted","Data":"0fb79deb7bff010900f89206ff4cd8c42393078ae77b9f3199ebc31889a8e005"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.519744 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" event={"ID":"34322326-016a-4e58-b14c-680c8cc94dbb","Type":"ContainerStarted","Data":"e22deffd3941a4fd444f04283ab2d356554267a42d59f8a7ced64cc659b40dab"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.530929 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" event={"ID":"17feaa75-00bd-4b47-a857-5fa5b27427fb","Type":"ContainerStarted","Data":"c9bcb3ce4a7784dec50cf4fd026e66b4fab90b192de73cb9bca5122f0213521a"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.543894 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" event={"ID":"a48a6ed4-aed1-433f-85d2-08e6beaea953","Type":"ContainerStarted","Data":"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.543960 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" event={"ID":"a48a6ed4-aed1-433f-85d2-08e6beaea953","Type":"ContainerStarted","Data":"d681b7cdc1babc5a4fd0a961c1fb069ee7d621c45faae875446e238e30c7f9c8"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.545218 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.547739 4658 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-27792 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.547799 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.550727 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hmn5s" podStartSLOduration=5.550711661 podStartE2EDuration="5.550711661s" podCreationTimestamp="2025-10-02 11:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.507171667 +0000 UTC m=+156.398325234" watchObservedRunningTime="2025-10-02 11:21:15.550711661 +0000 UTC m=+156.441865228"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.552737 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" podStartSLOduration=134.55272392 podStartE2EDuration="2m14.55272392s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.550270906 +0000 UTC m=+156.441424473" watchObservedRunningTime="2025-10-02 11:21:15.55272392 +0000 UTC m=+156.443877487"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.565206 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" event={"ID":"462e2be7-576b-4077-8e50-13b60aafa1bb","Type":"ContainerStarted","Data":"eeb62f7f32e765cf65521315637033952bab80e9e514c4658f164ca152fab2e8"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.565253 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" event={"ID":"462e2be7-576b-4077-8e50-13b60aafa1bb","Type":"ContainerStarted","Data":"2f318268dfc809ca3cd752c09cd78472800a545699fc0528474996499a40ae14"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.568804 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" event={"ID":"ebab7917-b306-46a8-8dc7-f99b4b162c71","Type":"ContainerStarted","Data":"d17bc423321d4e87e5f371a758b68ca6d82f7645df5a4ac092feec3fbd0e0bf3"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.572205 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" event={"ID":"2670838d-90c6-490a-a620-676073872108","Type":"ContainerStarted","Data":"1453f2ff1f6019c0a1b7c354296ac8c74965834a7fcf4e339a810dac320b081c"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.583045 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" event={"ID":"faad5b47-a113-424c-bba9-a681a4107f98","Type":"ContainerStarted","Data":"b7037726bf9d967b3fbcb11637d673f42be6184eb6c2b32d42488730b37ad99d"}
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.586206 4658 patch_prober.go:28] interesting pod/downloads-7954f5f757-w7rrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.586261 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w7rrv" podUID="c670b59a-b4ec-4332-9a76-72fee4666277" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.599505 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-g4bk2"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.601172 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.601229 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.601565 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.602947 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.102927212 +0000 UTC m=+156.994080779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.603925 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x64tz" podStartSLOduration=133.603900305 podStartE2EDuration="2m13.603900305s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.601344277 +0000 UTC m=+156.492497844" watchObservedRunningTime="2025-10-02 11:21:15.603900305 +0000 UTC m=+156.495053882"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.606735 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-v4m9t"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.609780 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j267v"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.680282 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lhc27" podStartSLOduration=134.680264684 podStartE2EDuration="2m14.680264684s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.677973246 +0000 UTC m=+156.569126813" watchObservedRunningTime="2025-10-02 11:21:15.680264684 +0000 UTC m=+156.571418241"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.709684 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.736608 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.236585286 +0000 UTC m=+157.127738923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.756507 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" podStartSLOduration=133.756486289 podStartE2EDuration="2m13.756486289s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.75418905 +0000 UTC m=+156.645342617" watchObservedRunningTime="2025-10-02 11:21:15.756486289 +0000 UTC m=+156.647639856"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.812986 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.813665 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.31364289 +0000 UTC m=+157.204796457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
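Note: the "connect: connection refused" readiness failures above are the normal startup race rather than a networking fault: the kubelet begins probing as soon as a container is reported started, and the dial is refused until the process inside binds its port. A rough standalone sketch of the HTTP check (hedged; the real kubelet prober also sets probe headers, and its HTTPS probes skip certificate verification):

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeOnce performs one HTTP(S) GET the way a kubelet-style prober would,
    // returning "success" for 2xx/3xx and a failure string otherwise.
    func probeOnce(url string) string {
    	client := &http.Client{
    		Timeout: time.Second,
    		Transport: &http.Transport{
    			// HTTPS probes do not verify the serving certificate.
    			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
    		},
    	}
    	resp, err := client.Get(url)
    	if err != nil {
    		// e.g. Get "https://10.217.0.28:8443/healthz": dial tcp 10.217.0.28:8443: connect: connection refused
    		return fmt.Sprintf("failure: %v", err)
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
    		return "success"
    	}
    	return fmt.Sprintf("failure: HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
    	fmt.Println(probeOnce("https://10.217.0.28:8443/healthz"))
    }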
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.837411 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" podStartSLOduration=133.837380043 podStartE2EDuration="2m13.837380043s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.814907852 +0000 UTC m=+156.706061419" watchObservedRunningTime="2025-10-02 11:21:15.837380043 +0000 UTC m=+156.728533610"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.882211 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4mdfm" podStartSLOduration=134.88219027 podStartE2EDuration="2m14.88219027s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.86438043 +0000 UTC m=+156.755533987" watchObservedRunningTime="2025-10-02 11:21:15.88219027 +0000 UTC m=+156.773343837"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.883159 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" podStartSLOduration=133.883151563 podStartE2EDuration="2m13.883151563s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.840450019 +0000 UTC m=+156.731603596" watchObservedRunningTime="2025-10-02 11:21:15.883151563 +0000 UTC m=+156.774305130"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.917440 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:15 crc kubenswrapper[4658]: E1002 11:21:15.917927 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.417912865 +0000 UTC m=+157.309066422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.933376 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-66nv9" podStartSLOduration=134.933348905 podStartE2EDuration="2m14.933348905s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.897723623 +0000 UTC m=+156.788877190" watchObservedRunningTime="2025-10-02 11:21:15.933348905 +0000 UTC m=+156.824502482"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.935138 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-47rr5" podStartSLOduration=133.935126976 podStartE2EDuration="2m13.935126976s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.926573402 +0000 UTC m=+156.817726969" watchObservedRunningTime="2025-10-02 11:21:15.935126976 +0000 UTC m=+156.826280543"
Oct 02 11:21:15 crc kubenswrapper[4658]: I1002 11:21:15.951277 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" podStartSLOduration=133.951254729 podStartE2EDuration="2m13.951254729s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:15.949859241 +0000 UTC m=+156.841012808" watchObservedRunningTime="2025-10-02 11:21:15.951254729 +0000 UTC m=+156.842408286"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.023738 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.024002 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.523962093 +0000 UTC m=+157.415115660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
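Note: the "Observed pod startup duration" lines report the kubelet's pod-start SLI. podStartSLOduration is the wall-clock time from podCreationTimestamp to watchObservedRunningTime minus any image-pull time; the zero-valued firstStartedPulling/lastFinishedPulling here mean no pull contributed, so it equals podStartE2EDuration. A quick back-of-envelope check in Go against the machine-config-server-hmn5s line above:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	// Values copied from the machine-config-server-hmn5s entry.
    	created, _ := time.Parse(layout, "2025-10-02 11:21:10 +0000 UTC")
    	running, _ := time.Parse(layout, "2025-10-02 11:21:15.550711661 +0000 UTC")
    	// No image pulls, so SLO duration == E2E duration.
    	fmt.Println(running.Sub(created)) // 5.550711661s, matching podStartSLOduration=5.550711661
    }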
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.024308 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.024666 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.524659097 +0000 UTC m=+157.415812664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.057987 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-g4bk2" podStartSLOduration=134.057965639 podStartE2EDuration="2m14.057965639s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.027642089 +0000 UTC m=+156.918795656" watchObservedRunningTime="2025-10-02 11:21:16.057965639 +0000 UTC m=+156.949119206"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.079799 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgmpv" podStartSLOduration=134.079782628 podStartE2EDuration="2m14.079782628s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.078906198 +0000 UTC m=+156.970059765" watchObservedRunningTime="2025-10-02 11:21:16.079782628 +0000 UTC m=+156.970936195"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.126825 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.127414 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.627397081 +0000 UTC m=+157.518550638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.149106 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chn4z" podStartSLOduration=135.149089535 podStartE2EDuration="2m15.149089535s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.105475269 +0000 UTC m=+156.996628836" watchObservedRunningTime="2025-10-02 11:21:16.149089535 +0000 UTC m=+157.040243102"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.228323 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.228682 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.728667894 +0000 UTC m=+157.619821471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.329880 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.335550 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.835467557 +0000 UTC m=+157.726621224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.432130 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.433025 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:16.933005613 +0000 UTC m=+157.824159180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.533397 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.533992 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.033958325 +0000 UTC m=+157.925111892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
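Note: each failed mount/unmount above is parked by nestedpendingoperations with a "No retries permitted until ..." deadline; durationBeforeRetry is the backoff the operation must wait out before the volume reconciler may try it again. Kubelet applies exponential backoff to a repeatedly failing volume operation; in the short window shown only the 500ms base delay appears. A sketch of the schedule's shape (constants illustrative, not kubelet's exact values):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const (
    		initial = 500 * time.Millisecond // the durationBeforeRetry seen above
    		factor  = 2
    		maxWait = 2 * time.Minute
    	)
    	wait := initial
    	for attempt := 1; attempt <= 6; attempt++ {
    		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, wait)
    		wait *= factor // double the delay after each consecutive failure
    		if wait > maxWait {
    			wait = maxWait
    		}
    	}
    }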
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.534517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.535044 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.035035183 +0000 UTC m=+157.926188750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.608864 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" event={"ID":"571e8f9f-9662-4139-9cf5-51093519d329","Type":"ContainerStarted","Data":"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.608919 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" event={"ID":"571e8f9f-9662-4139-9cf5-51093519d329","Type":"ContainerStarted","Data":"5ebc857c1bbf55c9940f46beb8849aa4a4e146d79dfab92168ef0551ebd5d716"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.609080 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.623442 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 11:21:16 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]process-running ok
Oct 02 11:21:16 crc kubenswrapper[4658]: healthz check failed
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.623508 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.629048 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" event={"ID":"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c","Type":"ContainerStarted","Data":"d0aa91785a9eaded40f61858296158d7cd7e6cbb3f865b7c65f6d2eb44e2f4c0"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.629093 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" event={"ID":"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c","Type":"ContainerStarted","Data":"c0fbe7d76a4d274a20c671dfacdcd40858da8af81ac684d0332ff5dc0c976916"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.630905 4658 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q5bt5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.630951 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.637186 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" event={"ID":"58ab4a40-4a69-4505-b60e-32b8b62eaeb5","Type":"ContainerStarted","Data":"1480bc841cdc35c24dd6c580a55a3049cd2aec2396aeb6f480e24cfc3e349857"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.638352 4658 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-b856h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.638750 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" podUID="58ab4a40-4a69-4505-b60e-32b8b62eaeb5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.643255 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.643717 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.14369236 +0000 UTC m=+158.034845927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.648933 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" event={"ID":"8b54e3fe-e025-45e2-bbf2-43f6ccadc773","Type":"ContainerStarted","Data":"f6b89616faed913c0f442afad8d3535c7d90d8deab2021bc432da421e92e20a3"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.661451 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.661613 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xbkft"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.666375 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qb9w" podStartSLOduration=134.666355527 podStartE2EDuration="2m14.666355527s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.664498414 +0000 UTC m=+157.555651981" watchObservedRunningTime="2025-10-02 11:21:16.666355527 +0000 UTC m=+157.557509104"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.667288 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" event={"ID":"01ef8f5c-e4a0-41be-ac66-47ded9a4fc52","Type":"ContainerStarted","Data":"cfb9f30fbabe99602e998f2d857341e6676fa1590950abc59d8f07285290c140"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.668494 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" event={"ID":"45c25ebf-9993-4c4d-843b-5084afce8cfa","Type":"ContainerStarted","Data":"3c4603ee574723cf27b3eb9c24eeafdaa1f667c9cde543cdffc9c4aa8526283b"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.668692 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" podStartSLOduration=134.668683047 podStartE2EDuration="2m14.668683047s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.642647824 +0000 UTC m=+157.533801401" watchObservedRunningTime="2025-10-02 11:21:16.668683047 +0000 UTC m=+157.559836614"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.676229 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" event={"ID":"5689bcf3-2722-4d26-8ee2-ebccfa61da08","Type":"ContainerStarted","Data":"f1c29405690044e2be0843e5d9c2a099f827d7786960cf13ef80a37b123decd8"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.677153 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.679848 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" event={"ID":"19e9649a-4386-4922-9cac-a57b34aa4d2e","Type":"ContainerStarted","Data":"75d84f293d9be873f65c9b015258383144cd232173eb99d8c135964214a751ee"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.685652 4658 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rrdvc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body=
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.685721 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" podUID="5689bcf3-2722-4d26-8ee2-ebccfa61da08" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.694613 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" event={"ID":"9414d700-2392-4e49-b703-8bcf624bdf60","Type":"ContainerStarted","Data":"740a8d4942a5d458d5a70325b48a2595b2bb589f7960fbd2347ef90a74988f99"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.694672 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" event={"ID":"9414d700-2392-4e49-b703-8bcf624bdf60","Type":"ContainerStarted","Data":"c16863f1c6fbfb5f30132a494e2fb82ba8d98a615f5768346b060a64cdd97d74"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.697852 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4lz8p" podStartSLOduration=134.697834547 podStartE2EDuration="2m14.697834547s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.694849434 +0000 UTC m=+157.586003011" watchObservedRunningTime="2025-10-02 11:21:16.697834547 +0000 UTC m=+157.588988104"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.719682 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" event={"ID":"2af92ba0-9e7b-4a45-9513-70217f77a845","Type":"ContainerStarted","Data":"dcdde387048a41632fd6d399c7ff1bfc001c7c732fda85fb783ac014e424c1fe"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.719742 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" event={"ID":"2af92ba0-9e7b-4a45-9513-70217f77a845","Type":"ContainerStarted","Data":"15197260fc5bece516d9b4924b4b9ec26dd2f8f121c49b45937076a3a8e25ef0"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.719757 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" event={"ID":"2af92ba0-9e7b-4a45-9513-70217f77a845","Type":"ContainerStarted","Data":"4d012f8634b16c182c0e19f9e796f8ba7f143eec420ce3f0e8a8867172038d62"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.720556 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.729867 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vn5bf" podStartSLOduration=134.729843204 podStartE2EDuration="2m14.729843204s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.724808872 +0000 UTC m=+157.615962469" watchObservedRunningTime="2025-10-02 11:21:16.729843204 +0000 UTC m=+157.620996771"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.734324 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" event={"ID":"e3384716-8ea8-411a-a4e3-a50ec1cf6790","Type":"ContainerStarted","Data":"1ebc27561f192941dc7065b0a12b0583542283abd8d7379f5cdc84fb21a39982"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.734379 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" event={"ID":"e3384716-8ea8-411a-a4e3-a50ec1cf6790","Type":"ContainerStarted","Data":"48a02cfe9bf14dbf0f9bce3579302545ef28138d97d10bef7cf4ea1bb9a651de"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.734392 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" event={"ID":"e3384716-8ea8-411a-a4e3-a50ec1cf6790","Type":"ContainerStarted","Data":"4e7b81ebb76e59e2b12882c79038c95d6e22b57469f5dc5d741cf5e6c3e01a1c"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.736269 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" event={"ID":"6d37daf8-ad4f-4531-b2e7-04adeda4de89","Type":"ContainerStarted","Data":"530a20fab920ec1edb4c9cb21fd60aebad5fce5b8adebf25d13156eebc12c8ea"}
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.737194 4658 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qlp92 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.737240 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" podUID="6d37daf8-ad4f-4531-b2e7-04adeda4de89" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.743381 4658 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xbkft container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]log ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]etcd ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/max-in-flight-filter ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Oct 02 11:21:16 crc kubenswrapper[4658]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/project.openshift.io-projectcache ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/openshift.io-startinformers ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/openshift.io-restmapperupdater ok
Oct 02 11:21:16 crc kubenswrapper[4658]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 02 11:21:16 crc kubenswrapper[4658]: livez check failed
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.743454 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" podUID="93963d75-dbb2-414c-9218-aee78bb8f819" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.746828 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd"
Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.747175 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.247157578 +0000 UTC m=+158.138311135 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.771515 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fj9hh" podStartSLOduration=134.771489413 podStartE2EDuration="2m14.771489413s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.766115778 +0000 UTC m=+157.657269375" watchObservedRunningTime="2025-10-02 11:21:16.771489413 +0000 UTC m=+157.662642980" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.801981 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" event={"ID":"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588","Type":"ContainerStarted","Data":"d27fa553d9a4392a0b3536a5b984e3668a013d471b96e53a01124540cd4f5bff"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.802339 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" event={"ID":"4c784dcc-2a24-462a-aaf8-7c3cf4d2d588","Type":"ContainerStarted","Data":"fb2643668d7068833ce7b68374bd6235ca724f571f787e0bdb0ee94f66513969"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.805503 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" podStartSLOduration=134.805485099 podStartE2EDuration="2m14.805485099s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.804854808 +0000 UTC m=+157.696008375" watchObservedRunningTime="2025-10-02 11:21:16.805485099 +0000 UTC m=+157.696638666" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.844695 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m5pf4" event={"ID":"faad5b47-a113-424c-bba9-a681a4107f98","Type":"ContainerStarted","Data":"280140309e99ee808695ac010be1d4627068fdc8a3da819a56419cc60a4f811e"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.851549 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.852915 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.352884074 +0000 UTC m=+158.244037661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.860120 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" event={"ID":"40342360-13dd-4953-805b-354528d0879d","Type":"ContainerStarted","Data":"270fef9d1d03d6f6ee14b7fbbe88febc1399b59173708f439af0cef562511b23"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.869274 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j6d5m" podStartSLOduration=134.869249167 podStartE2EDuration="2m14.869249167s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.832801476 +0000 UTC m=+157.723955043" watchObservedRunningTime="2025-10-02 11:21:16.869249167 +0000 UTC m=+157.760402744" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.869414 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2ltpz" podStartSLOduration=134.869408442 podStartE2EDuration="2m14.869408442s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.867673172 +0000 UTC m=+157.758826739" watchObservedRunningTime="2025-10-02 11:21:16.869408442 +0000 UTC m=+157.760562009" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.897625 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" event={"ID":"e5b9eed3-8130-46f7-9418-caa829997f64","Type":"ContainerStarted","Data":"e28fba4e8fddb4c04b17bfea67df7400834614369efb04402ab427d829d9cfbd"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.904246 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l6vlk" event={"ID":"8e526afb-63d8-4825-87c5-d039c0b81aeb","Type":"ContainerStarted","Data":"469de8735b13cea7d55c85611a23e0ded8a7074c3406bcb048be48f0ae0e1ba3"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.904324 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l6vlk" event={"ID":"8e526afb-63d8-4825-87c5-d039c0b81aeb","Type":"ContainerStarted","Data":"e9b1c3b9725f8e68c9e5a7b31110b5fc7025fdfd14d18430bf2489bc73174aec"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.905047 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.906247 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmn5s" event={"ID":"89e4a0f1-ceb6-41f9-ab80-e397d4962f59","Type":"ContainerStarted","Data":"ad78a2224cba58fa6483c9d6f1b3a1e69c348c2f19e7d7f47206c5f504a418bc"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.907842 4658 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" event={"ID":"c7d11297-76f5-4bdd-a744-57ad6376de77","Type":"ContainerStarted","Data":"e5ba68c0abb79fafa459abd278582b539aeb645727f24119389cb207a4c149cd"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.934897 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8xwb5" event={"ID":"78ef8aa8-69b1-43ae-844a-9b3bed415a4a","Type":"ContainerStarted","Data":"dd856d4d94f4afc7324b8b06aa320b6b528df6aacb6345a18c2b11f25d1e0973"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.934960 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8xwb5" event={"ID":"78ef8aa8-69b1-43ae-844a-9b3bed415a4a","Type":"ContainerStarted","Data":"660d5408729bce344937849fbd6b92a79756ac40474ff20d7473da33d974d8b9"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.953808 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:16 crc kubenswrapper[4658]: E1002 11:21:16.954142 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.454127717 +0000 UTC m=+158.345281284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.963417 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" podStartSLOduration=134.963400556 podStartE2EDuration="2m14.963400556s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.902184916 +0000 UTC m=+157.793338493" watchObservedRunningTime="2025-10-02 11:21:16.963400556 +0000 UTC m=+157.854554123" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.964182 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fx7w" podStartSLOduration=135.964178213 podStartE2EDuration="2m15.964178213s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.961708458 +0000 UTC m=+157.852862025" watchObservedRunningTime="2025-10-02 11:21:16.964178213 +0000 UTC m=+157.855331780" Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.975447 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" event={"ID":"a87e683d-1f76-40b8-bfeb-b06076224893","Type":"ContainerStarted","Data":"13e345ea11b5c86299d6e8136a8046c6654c8cebb7634e84f52a89f1ed01e117"} Oct 02 11:21:16 crc kubenswrapper[4658]: I1002 11:21:16.996027 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.000925 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.018680 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" podStartSLOduration=135.018655661 podStartE2EDuration="2m15.018655661s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:16.995491406 +0000 UTC m=+157.886644973" watchObservedRunningTime="2025-10-02 11:21:17.018655661 +0000 UTC m=+157.909809228" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.019137 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" podStartSLOduration=136.019131407 podStartE2EDuration="2m16.019131407s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:17.015946988 +0000 UTC m=+157.907100555" watchObservedRunningTime="2025-10-02 11:21:17.019131407 +0000 UTC m=+157.910284974" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.056016 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.058496 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.558457766 +0000 UTC m=+158.449611333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.073483 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8xwb5" podStartSLOduration=7.073461091 podStartE2EDuration="7.073461091s" podCreationTimestamp="2025-10-02 11:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:17.072181067 +0000 UTC m=+157.963334634" watchObservedRunningTime="2025-10-02 11:21:17.073461091 +0000 UTC m=+157.964614658" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.100558 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l6vlk" podStartSLOduration=7.100541459 podStartE2EDuration="7.100541459s" podCreationTimestamp="2025-10-02 11:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:17.098031784 +0000 UTC m=+157.989185351" watchObservedRunningTime="2025-10-02 11:21:17.100541459 +0000 UTC m=+157.991695026" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.133525 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d5zmz" podStartSLOduration=135.13351002 podStartE2EDuration="2m15.13351002s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:17.132786015 +0000 UTC m=+158.023939582" watchObservedRunningTime="2025-10-02 11:21:17.13351002 +0000 UTC m=+158.024663587" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.160900 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.161246 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.661231881 +0000 UTC m=+158.552385448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.261864 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.262367 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.762287927 +0000 UTC m=+158.653441494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.365518 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.366048 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.866033176 +0000 UTC m=+158.757186743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.466765 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.467105 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:17.967085842 +0000 UTC m=+158.858239409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.568419 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.568797 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.06878217 +0000 UTC m=+158.959935737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.597236 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:17 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:17 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:17 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.597319 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.669834 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.670319 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.170284141 +0000 UTC m=+159.061437708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.771031 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.771528 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.271509473 +0000 UTC m=+159.162663030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.872647 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.873093 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.373073877 +0000 UTC m=+159.264227444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.973957 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:17 crc kubenswrapper[4658]: E1002 11:21:17.974285 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.474267858 +0000 UTC m=+159.365421445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.975794 4658 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wvclq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.975880 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.981389 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lrcf" event={"ID":"40342360-13dd-4953-805b-354528d0879d","Type":"ContainerStarted","Data":"4a093acfd5ff4ede9cba4b909bbe68919761ae51648623559653d7d0ecfc072b"} Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.982754 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" event={"ID":"45c25ebf-9993-4c4d-843b-5084afce8cfa","Type":"ContainerStarted","Data":"a6a8defc683c5b89f8a75dff809e818710045e51bbce930c5294f2d13fb75448"} Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.985028 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" event={"ID":"02bfe179-0d4b-4cd6-b2d0-b3aeaf023f5c","Type":"ContainerStarted","Data":"4d450f92dacc5bc5050cbd6ff70c8cde5add5b15d14afb07e7f60087e00dfa90"} Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.987761 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" event={"ID":"02ca41da-9f6c-4432-9041-32e3aeba0e92","Type":"ContainerStarted","Data":"12167c0763d90ce871bc2e9d2feb68303258a6d4e1558d2e21089f108f74b07c"} Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.988186 4658 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q5bt5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.988234 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Oct 02 11:21:17 crc kubenswrapper[4658]: I1002 11:21:17.997749 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b856h" Oct 02 11:21:18 crc kubenswrapper[4658]: 
I1002 11:21:18.010659 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qlp92" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.012622 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgmnk" podStartSLOduration=136.012611603 podStartE2EDuration="2m16.012611603s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:18.010830872 +0000 UTC m=+158.901984439" watchObservedRunningTime="2025-10-02 11:21:18.012611603 +0000 UTC m=+158.903765170" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.040636 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" podStartSLOduration=136.040616664 podStartE2EDuration="2m16.040616664s" podCreationTimestamp="2025-10-02 11:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:18.035376634 +0000 UTC m=+158.926530201" watchObservedRunningTime="2025-10-02 11:21:18.040616664 +0000 UTC m=+158.931770231" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.075498 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.075647 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.575625905 +0000 UTC m=+159.466779472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.075944 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.081859 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.581843047 +0000 UTC m=+159.472996614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.178015 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.178465 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.678444501 +0000 UTC m=+159.569598068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.195976 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.280214 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.280549 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.780535122 +0000 UTC m=+159.671688689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.318804 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.318997 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.321027 4658 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-h6x29 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.321078 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" podUID="02ca41da-9f6c-4432-9041-32e3aeba0e92" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.381220 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.381608 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.881587409 +0000 UTC m=+159.772740976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.412914 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.482537 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.482859 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:18.982843712 +0000 UTC m=+159.873997279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.583907 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.584285 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.084266801 +0000 UTC m=+159.975420368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.595930 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:18 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:18 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:18 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.595989 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.684907 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.685385 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.185352288 +0000 UTC m=+160.076505855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.786324 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.786651 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.286619192 +0000 UTC m=+160.177772749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.786777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.787096 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.287082228 +0000 UTC m=+160.178235795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.812643 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrdvc" Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.888427 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.888624 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.388591539 +0000 UTC m=+160.279745106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.888834 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.889158 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.389150748 +0000 UTC m=+160.280304315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.945615 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"] Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.989776 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.989917 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.489894234 +0000 UTC m=+160.381047801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.989970 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:18 crc kubenswrapper[4658]: E1002 11:21:18.990310 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.490287097 +0000 UTC m=+160.381440664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.993612 4658 generic.go:334] "Generic (PLEG): container finished" podID="c7d11297-76f5-4bdd-a744-57ad6376de77" containerID="e5ba68c0abb79fafa459abd278582b539aeb645727f24119389cb207a4c149cd" exitCode=0 Oct 02 11:21:18 crc kubenswrapper[4658]: I1002 11:21:18.993770 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" event={"ID":"c7d11297-76f5-4bdd-a744-57ad6376de77","Type":"ContainerDied","Data":"e5ba68c0abb79fafa459abd278582b539aeb645727f24119389cb207a4c149cd"} Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.072288 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.073606 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.079867 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.090907 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.091015 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 11:21:19.590994552 +0000 UTC m=+160.482148119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.093205 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.097934 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.597915478 +0000 UTC m=+160.489069045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.108156 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.194979 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.195179 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.695146294 +0000 UTC m=+160.586299871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.195287 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.195450 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.195524 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zsq\" (UniqueName: \"kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.195621 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.195983 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.695972762 +0000 UTC m=+160.587126329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.253914 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.254904 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.257106 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.293004 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.296432 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.296601 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.296652 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.296718 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zsq\" (UniqueName: \"kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.297074 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.797055739 +0000 UTC m=+160.688209306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.297445 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.297649 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.351534 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zsq\" (UniqueName: \"kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq\") pod \"certified-operators-wchxv\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.388866 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.397732 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.397779 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xx4\" (UniqueName: \"kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.397819 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.398013 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: 
E1002 11:21:19.398505 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:19.898479718 +0000 UTC m=+160.789633295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.465286 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.466473 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.486915 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.499553 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.499780 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xx4\" (UniqueName: \"kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.499854 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.499979 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.500654 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.500760 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-02 11:21:20.000733886 +0000 UTC m=+160.891887453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.501386 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.535126 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xx4\" (UniqueName: \"kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4\") pod \"community-operators-qzshr\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.580614 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.601804 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m77dc\" (UniqueName: \"kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.601873 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.601901 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-catalog-content\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.601932 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.602250 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 11:21:20.102234107 +0000 UTC m=+160.993387674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.610657 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:19 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:19 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:19 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.610730 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.666483 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.667610 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.683167 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.703166 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.703379 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.203348755 +0000 UTC m=+161.094502322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.703426 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m77dc\" (UniqueName: \"kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.703598 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.703629 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-catalog-content\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.703673 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.704105 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.204098201 +0000 UTC m=+161.095251768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.704995 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.705255 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-catalog-content\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.748222 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m77dc\" (UniqueName: \"kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc\") pod \"certified-operators-6j7kz\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") " pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.813804 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.814937 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.815178 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.315141689 +0000 UTC m=+161.206295256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.815377 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.815441 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4wj\" (UniqueName: \"kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.815508 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.815574 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.815964 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.315957148 +0000 UTC m=+161.207110715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.925650 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.925864 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.425832507 +0000 UTC m=+161.316986074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.926155 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.926213 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4wj\" (UniqueName: \"kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.926277 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.926363 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.926828 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: E1002 11:21:19.927105 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.427091959 +0000 UTC m=+161.318245526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.929797 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.960884 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4wj\" (UniqueName: \"kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj\") pod \"community-operators-nn4mk\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.987336 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:21:19 crc kubenswrapper[4658]: I1002 11:21:19.992600 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.009782 4658 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.027515 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.027770 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.527742842 +0000 UTC m=+161.418896399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.027896 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.028244 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.528235999 +0000 UTC m=+161.419389566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.045188 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" event={"ID":"45c25ebf-9993-4c4d-843b-5084afce8cfa","Type":"ContainerStarted","Data":"d8d2f33ab7e2979568edd13dc0b824be2e489a3caeeaa71d2aa25fc747fc5765"} Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.045271 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" event={"ID":"45c25ebf-9993-4c4d-843b-5084afce8cfa","Type":"ContainerStarted","Data":"d8e0abdc38c25accfa2953610685d9b45d9735e9aff9ea0dbecfa82065151f90"} Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.045399 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" containerName="controller-manager" containerID="cri-o://b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3" gracePeriod=30 Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.130166 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.131323 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.631256492 +0000 UTC m=+161.522410059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.231264 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.234775 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.235223 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.735208918 +0000 UTC m=+161.626362485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: W1002 11:21:20.285508 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded79e082_6f4d_418d_bf20_621fb495976a.slice/crio-b851029894a8f17d7b4e1a125750c5e12948e8e4322790dd257a6ce8c039e29a WatchSource:0}: Error finding container b851029894a8f17d7b4e1a125750c5e12948e8e4322790dd257a6ce8c039e29a: Status 404 returned error can't find the container with id b851029894a8f17d7b4e1a125750c5e12948e8e4322790dd257a6ce8c039e29a Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.311659 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"] Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.336154 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.336560 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.836526123 +0000 UTC m=+161.727679690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: W1002 11:21:20.347637 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3286b84_35b9_4116_b8e9_e84fb5f50d23.slice/crio-18903b8742ba185baa54a1109ba925a27ac90738823992ff5c368e9514e4148d WatchSource:0}: Error finding container 18903b8742ba185baa54a1109ba925a27ac90738823992ff5c368e9514e4148d: Status 404 returned error can't find the container with id 18903b8742ba185baa54a1109ba925a27ac90738823992ff5c368e9514e4148d Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.434284 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.442435 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: E1002 11:21:20.442798 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:21:20.942783138 +0000 UTC m=+161.833936705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-swmbd" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.450217 4658 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T11:21:20.009806727Z","Handler":null,"Name":""} Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.462540 4658 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.462584 4658 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.544068 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume\") pod \"c7d11297-76f5-4bdd-a744-57ad6376de77\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.544129 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume\") pod \"c7d11297-76f5-4bdd-a744-57ad6376de77\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.544209 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62mk4\" (UniqueName: \"kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4\") pod \"c7d11297-76f5-4bdd-a744-57ad6376de77\" (UID: \"c7d11297-76f5-4bdd-a744-57ad6376de77\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.544463 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.544991 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7d11297-76f5-4bdd-a744-57ad6376de77" (UID: "c7d11297-76f5-4bdd-a744-57ad6376de77"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.551804 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7d11297-76f5-4bdd-a744-57ad6376de77" (UID: "c7d11297-76f5-4bdd-a744-57ad6376de77"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.552404 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4" (OuterVolumeSpecName: "kube-api-access-62mk4") pod "c7d11297-76f5-4bdd-a744-57ad6376de77" (UID: "c7d11297-76f5-4bdd-a744-57ad6376de77"). InnerVolumeSpecName "kube-api-access-62mk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.556552 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.597609 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:20 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:20 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:20 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.597691 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.616591 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.646662 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.646908 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7d11297-76f5-4bdd-a744-57ad6376de77-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.647000 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7d11297-76f5-4bdd-a744-57ad6376de77-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.647094 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62mk4\" (UniqueName: \"kubernetes.io/projected/c7d11297-76f5-4bdd-a744-57ad6376de77-kube-api-access-62mk4\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.649141 4658 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.649183 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.668851 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-swmbd\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: W1002 11:21:20.677816 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4d275a_1db6_471e_87f3_162c144e7586.slice/crio-596444e2df090e358deb8a9806c53b3dee1fdd7fecd9800dbccef1d857961d40 WatchSource:0}: Error finding container 596444e2df090e358deb8a9806c53b3dee1fdd7fecd9800dbccef1d857961d40: Status 404 returned error can't find the container with id 596444e2df090e358deb8a9806c53b3dee1fdd7fecd9800dbccef1d857961d40 Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.687847 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.849861 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles\") pod \"60f101b6-dee6-41af-8943-cd8ebfd1d528\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850176 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvx8\" (UniqueName: \"kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8\") pod \"60f101b6-dee6-41af-8943-cd8ebfd1d528\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850256 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca\") pod \"60f101b6-dee6-41af-8943-cd8ebfd1d528\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850377 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config\") pod \"60f101b6-dee6-41af-8943-cd8ebfd1d528\" (UID: \"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850469 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert\") pod \"60f101b6-dee6-41af-8943-cd8ebfd1d528\" (UID: 
\"60f101b6-dee6-41af-8943-cd8ebfd1d528\") " Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850595 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60f101b6-dee6-41af-8943-cd8ebfd1d528" (UID: "60f101b6-dee6-41af-8943-cd8ebfd1d528"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.850906 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca" (OuterVolumeSpecName: "client-ca") pod "60f101b6-dee6-41af-8943-cd8ebfd1d528" (UID: "60f101b6-dee6-41af-8943-cd8ebfd1d528"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.851135 4658 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.851885 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config" (OuterVolumeSpecName: "config") pod "60f101b6-dee6-41af-8943-cd8ebfd1d528" (UID: "60f101b6-dee6-41af-8943-cd8ebfd1d528"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.854585 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8" (OuterVolumeSpecName: "kube-api-access-kqvx8") pod "60f101b6-dee6-41af-8943-cd8ebfd1d528" (UID: "60f101b6-dee6-41af-8943-cd8ebfd1d528"). InnerVolumeSpecName "kube-api-access-kqvx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.855128 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60f101b6-dee6-41af-8943-cd8ebfd1d528" (UID: "60f101b6-dee6-41af-8943-cd8ebfd1d528"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.932326 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.952792 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvx8\" (UniqueName: \"kubernetes.io/projected/60f101b6-dee6-41af-8943-cd8ebfd1d528-kube-api-access-kqvx8\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.953011 4658 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.953418 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f101b6-dee6-41af-8943-cd8ebfd1d528-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:20 crc kubenswrapper[4658]: I1002 11:21:20.953656 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f101b6-dee6-41af-8943-cd8ebfd1d528-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.058245 4658 generic.go:334] "Generic (PLEG): container finished" podID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerID="aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc" exitCode=0 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.058455 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerDied","Data":"aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.058613 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerStarted","Data":"03f90ea38b8cc9b6ff583652490b27c345baf08c30c46c21a8d1b6bfa1a73ae9"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.059825 4658 generic.go:334] "Generic (PLEG): container finished" podID="5e4d275a-1db6-471e-87f3-162c144e7586" containerID="14c65e3b2a5a75810b7a8dc0f3fc6449f864e3fb2ad89a346dbec261855007be" exitCode=0 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.059870 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerDied","Data":"14c65e3b2a5a75810b7a8dc0f3fc6449f864e3fb2ad89a346dbec261855007be"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.059888 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerStarted","Data":"596444e2df090e358deb8a9806c53b3dee1fdd7fecd9800dbccef1d857961d40"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.060748 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.068746 4658 generic.go:334] "Generic (PLEG): container finished" podID="60f101b6-dee6-41af-8943-cd8ebfd1d528" containerID="b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3" exitCode=0 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.068906 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" 
event={"ID":"60f101b6-dee6-41af-8943-cd8ebfd1d528","Type":"ContainerDied","Data":"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.068970 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" event={"ID":"60f101b6-dee6-41af-8943-cd8ebfd1d528","Type":"ContainerDied","Data":"7ff2337159a796d9c427a9407ff46fc6f0b2618b36507fbc2846de3cd6163c4e"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.068995 4658 scope.go:117] "RemoveContainer" containerID="b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.069220 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cmjlm" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.080000 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.080035 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z" event={"ID":"c7d11297-76f5-4bdd-a744-57ad6376de77","Type":"ContainerDied","Data":"4f76ac8b12a37a64db991f736345aa34b1bebe5591d13173c4b52c1ee5047dc9"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.080095 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f76ac8b12a37a64db991f736345aa34b1bebe5591d13173c4b52c1ee5047dc9" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.100203 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" event={"ID":"45c25ebf-9993-4c4d-843b-5084afce8cfa","Type":"ContainerStarted","Data":"00406e317f222d983c57bef5884c79ad0223961d3c5e1ffc8b503e33c65dbeff"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.100263 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.101923 4658 generic.go:334] "Generic (PLEG): container finished" podID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerID="1f16b6471eecdc81738169c95b417626e7ea0a29df92740cf7bde2b790cca795" exitCode=0 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.101980 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerDied","Data":"1f16b6471eecdc81738169c95b417626e7ea0a29df92740cf7bde2b790cca795"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.102002 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerStarted","Data":"18903b8742ba185baa54a1109ba925a27ac90738823992ff5c368e9514e4148d"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.103397 4658 generic.go:334] "Generic (PLEG): container finished" podID="ed79e082-6f4d-418d-bf20-621fb495976a" containerID="1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741" exitCode=0 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.103428 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" 
event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerDied","Data":"1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.103446 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerStarted","Data":"b851029894a8f17d7b4e1a125750c5e12948e8e4322790dd257a6ce8c039e29a"} Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.104881 4658 scope.go:117] "RemoveContainer" containerID="b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3" Oct 02 11:21:21 crc kubenswrapper[4658]: E1002 11:21:21.105837 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3\": container with ID starting with b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3 not found: ID does not exist" containerID="b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.105939 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3"} err="failed to get container status \"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3\": rpc error: code = NotFound desc = could not find container \"b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3\": container with ID starting with b0d8b9e6ecdfdfda283039894f45b23f05180b9464c69de6c78464c3fc8f77a3 not found: ID does not exist" Oct 02 11:21:21 crc kubenswrapper[4658]: W1002 11:21:21.116705 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b99dd62_8d35_4423_a53a_da7654a17fb7.slice/crio-bec11d2ac06f6f43e3828de35ec55e1330d6571e922379f90e1da1b3a83c5c37 WatchSource:0}: Error finding container bec11d2ac06f6f43e3828de35ec55e1330d6571e922379f90e1da1b3a83c5c37: Status 404 returned error can't find the container with id bec11d2ac06f6f43e3828de35ec55e1330d6571e922379f90e1da1b3a83c5c37 Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.117033 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.119616 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cmjlm"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.130815 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kbq7v" podStartSLOduration=11.130792346 podStartE2EDuration="11.130792346s" podCreationTimestamp="2025-10-02 11:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:21.127127851 +0000 UTC m=+162.018281448" watchObservedRunningTime="2025-10-02 11:21:21.130792346 +0000 UTC m=+162.021945933" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.232983 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:21:21 crc kubenswrapper[4658]: E1002 11:21:21.233188 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" 
containerName="controller-manager" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.233201 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" containerName="controller-manager" Oct 02 11:21:21 crc kubenswrapper[4658]: E1002 11:21:21.233213 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d11297-76f5-4bdd-a744-57ad6376de77" containerName="collect-profiles" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.233219 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d11297-76f5-4bdd-a744-57ad6376de77" containerName="collect-profiles" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.233337 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d11297-76f5-4bdd-a744-57ad6376de77" containerName="collect-profiles" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.233352 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" containerName="controller-manager" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.234095 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.237945 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.246827 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.360244 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.360694 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.360723 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mb44\" (UniqueName: \"kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.461717 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.461771 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mb44\" (UniqueName: \"kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44\") pod \"redhat-marketplace-tjkgd\" (UID: 
\"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.461861 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.462325 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.462640 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.492447 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mb44\" (UniqueName: \"kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44\") pod \"redhat-marketplace-tjkgd\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.556492 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.596742 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:21 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:21 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:21 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.596820 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.643100 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.645701 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.664522 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.668038 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.676133 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xbkft" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.700966 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.701856 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.705370 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.705619 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.705793 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.706031 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.706350 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.706684 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.712317 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.716560 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.766955 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhc8\" (UniqueName: \"kubernetes.io/projected/10d795d1-5e35-42da-9cd9-9761be302b1b-kube-api-access-hrhc8\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.767052 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.767222 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868235 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868402 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jhk\" (UniqueName: \"kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868445 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhc8\" (UniqueName: \"kubernetes.io/projected/10d795d1-5e35-42da-9cd9-9761be302b1b-kube-api-access-hrhc8\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868573 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868652 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868770 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868919 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.868995 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " 
pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.871106 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.871534 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.893224 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhc8\" (UniqueName: \"kubernetes.io/projected/10d795d1-5e35-42da-9cd9-9761be302b1b-kube-api-access-hrhc8\") pod \"redhat-marketplace-lrxpp\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") " pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.966185 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f101b6-dee6-41af-8943-cd8ebfd1d528" path="/var/lib/kubelet/pods/60f101b6-dee6-41af-8943-cd8ebfd1d528/volumes" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.967590 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.969777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.969826 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.969869 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jhk\" (UniqueName: \"kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.969893 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.969932 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.971259 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.971431 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.972426 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.975488 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.976340 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.984452 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:21:21 crc kubenswrapper[4658]: I1002 11:21:21.990554 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jhk\" (UniqueName: \"kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk\") pod \"controller-manager-879f6c89f-2chjq\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:22 crc kubenswrapper[4658]: W1002 11:21:22.002221 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45634610_7bec_413b_8b11_3b90a851b37b.slice/crio-79d41d4f7abd2b7392336092511933b6dec3ec6ab9920e0436134b7c037bae3c WatchSource:0}: Error finding container 79d41d4f7abd2b7392336092511933b6dec3ec6ab9920e0436134b7c037bae3c: Status 404 returned error can't find the container with id 79d41d4f7abd2b7392336092511933b6dec3ec6ab9920e0436134b7c037bae3c Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.022767 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.129775 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" event={"ID":"8b99dd62-8d35-4423-a53a-da7654a17fb7","Type":"ContainerStarted","Data":"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651"} Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.130182 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.130198 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" event={"ID":"8b99dd62-8d35-4423-a53a-da7654a17fb7","Type":"ContainerStarted","Data":"bec11d2ac06f6f43e3828de35ec55e1330d6571e922379f90e1da1b3a83c5c37"} Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.135499 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerStarted","Data":"79d41d4f7abd2b7392336092511933b6dec3ec6ab9920e0436134b7c037bae3c"} Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.158323 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" podStartSLOduration=141.158283009 podStartE2EDuration="2m21.158283009s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:22.153050069 +0000 UTC m=+163.044203646" watchObservedRunningTime="2025-10-02 11:21:22.158283009 +0000 UTC m=+163.049436576" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.237375 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.239272 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.244546 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.247414 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.321733 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.322439 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.325517 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.326362 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.337461 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.360888 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"] Oct 02 11:21:22 crc kubenswrapper[4658]: W1002 11:21:22.373558 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d795d1_5e35_42da_9cd9_9761be302b1b.slice/crio-ecbdad5a576180f5f76f8f9a67814930ddbfbf9f66135be8be1d6549eb547fbc WatchSource:0}: Error finding container ecbdad5a576180f5f76f8f9a67814930ddbfbf9f66135be8be1d6549eb547fbc: Status 404 returned error can't find the container with id ecbdad5a576180f5f76f8f9a67814930ddbfbf9f66135be8be1d6549eb547fbc Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.379811 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.379927 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wllx\" (UniqueName: \"kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.379984 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: W1002 11:21:22.421055 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75513c8_260e_4da7_8a62_ef8cd4dc52f4.slice/crio-32ba0976ef03e44804dd0d355481f615aba3107c4b9788ce8ec71c707e4d93ed WatchSource:0}: Error finding container 32ba0976ef03e44804dd0d355481f615aba3107c4b9788ce8ec71c707e4d93ed: Status 404 returned error can't find the container with id 32ba0976ef03e44804dd0d355481f615aba3107c4b9788ce8ec71c707e4d93ed Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.424529 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.481310 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.481462 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.481493 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wllx\" (UniqueName: \"kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.481560 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.481592 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.482097 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.482740 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.502629 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wllx\" (UniqueName: \"kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx\") pod \"redhat-operators-fzf5c\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.558940 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.583884 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.583995 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.584007 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.600543 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:22 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:22 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:22 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.600620 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.605566 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.658542 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.661227 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.666643 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.695616 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"] Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.742289 4658 patch_prober.go:28] interesting pod/downloads-7954f5f757-w7rrv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.742384 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-w7rrv" podUID="c670b59a-b4ec-4332-9a76-72fee4666277" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.742778 4658 patch_prober.go:28] interesting pod/downloads-7954f5f757-w7rrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.742846 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w7rrv" podUID="c670b59a-b4ec-4332-9a76-72fee4666277" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.789084 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.789323 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vmd7\" (UniqueName: \"kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.789384 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.826455 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.826505 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.828444 4658 patch_prober.go:28] interesting pod/console-f9d7485db-md7fr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= 
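
Note: the router-default startup-probe failures above embed a healthz checklist in start-of-body ([-]backend-http and [-]has-synced failing, [+]process-running ok), while the downloads and console probes fail at the TCP layer (connection refused), so they carry no body at all. A small Go sketch of pulling the failing check names out of such a probe body follows; the body text is copied from the router entries, the parser is illustrative, not kubelet code:

    // failedChecks returns the names of healthz checks marked "[-]" (failed)
    // in a probe response body of the form seen in the router log entries.
    package main

    import (
        "fmt"
        "strings"
    )

    func failedChecks(body string) []string {
        var failed []string
        for _, line := range strings.Split(body, "\n") {
            line = strings.TrimSpace(line)
            if strings.HasPrefix(line, "[-]") {
                name := strings.TrimPrefix(line, "[-]")
                if i := strings.Index(name, " "); i > 0 {
                    name = name[:i]
                }
                failed = append(failed, name)
            }
        }
        return failed
    }

    func main() {
        // Body copied from the router-default startup probe entries above.
        body := "[-]backend-http failed: reason withheld\n" +
            "[-]has-synced failed: reason withheld\n" +
            "[+]process-running ok\n" +
            "healthz check failed"
        fmt.Println(failedChecks(body)) // prints: [backend-http has-synced]
    }
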
Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.828496 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-md7fr" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.890225 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.890647 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vmd7\" (UniqueName: \"kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.890721 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.891393 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.891663 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:22 crc kubenswrapper[4658]: I1002 11:21:22.913547 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vmd7\" (UniqueName: \"kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7\") pod \"redhat-operators-lvfxq\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") " pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.004388 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.063180 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 11:21:23 crc kubenswrapper[4658]: W1002 11:21:23.084421 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0c4d8047_922c_4594_8ba2_3624fc2e73c5.slice/crio-d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09 WatchSource:0}: Error finding container d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09: Status 404 returned error can't find the container with id d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09 Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.189308 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.193934 4658 generic.go:334] "Generic (PLEG): container finished" podID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerID="f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872" exitCode=0 Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.194020 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerDied","Data":"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"} Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.194051 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerStarted","Data":"ecbdad5a576180f5f76f8f9a67814930ddbfbf9f66135be8be1d6549eb547fbc"} Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.200189 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0c4d8047-922c-4594-8ba2-3624fc2e73c5","Type":"ContainerStarted","Data":"d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09"} Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.201789 4658 generic.go:334] "Generic (PLEG): container finished" podID="45634610-7bec-413b-8b11-3b90a851b37b" containerID="bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135" exitCode=0 Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.201834 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerDied","Data":"bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135"} Oct 02 11:21:23 crc kubenswrapper[4658]: W1002 11:21:23.204940 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5605be_988f_43bc_b3e1_4d7346ef81cf.slice/crio-ce9df9ce731de7ee46008067c08351ae97e88667035504a5c060aafbaaa3fa84 WatchSource:0}: Error finding container ce9df9ce731de7ee46008067c08351ae97e88667035504a5c060aafbaaa3fa84: Status 404 returned error can't find the container with id ce9df9ce731de7ee46008067c08351ae97e88667035504a5c060aafbaaa3fa84 Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.226786 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" 
event={"ID":"e75513c8-260e-4da7-8a62-ef8cd4dc52f4","Type":"ContainerStarted","Data":"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15"} Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.226830 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" event={"ID":"e75513c8-260e-4da7-8a62-ef8cd4dc52f4","Type":"ContainerStarted","Data":"32ba0976ef03e44804dd0d355481f615aba3107c4b9788ce8ec71c707e4d93ed"} Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.226847 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.246982 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.288924 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" podStartSLOduration=4.288899979 podStartE2EDuration="4.288899979s" podCreationTimestamp="2025-10-02 11:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:23.277044583 +0000 UTC m=+164.168198160" watchObservedRunningTime="2025-10-02 11:21:23.288899979 +0000 UTC m=+164.180053546" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.339716 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.358894 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h6x29" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.561712 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"] Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.593955 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.598930 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:23 crc kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:23 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:23 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.599000 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:23 crc kubenswrapper[4658]: I1002 11:21:23.853989 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.240366 4658 generic.go:334] "Generic (PLEG): container finished" podID="dc132895-415b-49fb-86b7-63c32990a0cd" containerID="d2419bb356296951a88c9869f743592dba35f614c4cf29507d4679cbafb61ca1" exitCode=0 Oct 02 11:21:24 crc 
Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.240796 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerDied","Data":"d2419bb356296951a88c9869f743592dba35f614c4cf29507d4679cbafb61ca1"} Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.240831 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerStarted","Data":"e2ee882b0c77b7fc537fd66de4f6e2f680c829d7b35b3f44a8cf1b700c421b8d"} Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.259810 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0c4d8047-922c-4594-8ba2-3624fc2e73c5","Type":"ContainerStarted","Data":"ce5d7db902a5fc2206fc85364beb309da8bbf0ac5c7340041390cc57bbf54d52"} Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.269350 4658 generic.go:334] "Generic (PLEG): container finished" podID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerID="f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7" exitCode=0 Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.270748 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerDied","Data":"f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7"} Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.280498 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerStarted","Data":"ce9df9ce731de7ee46008067c08351ae97e88667035504a5c060aafbaaa3fa84"} Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.293342 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.293325071 podStartE2EDuration="2.293325071s" podCreationTimestamp="2025-10-02 11:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:24.292125549 +0000 UTC m=+165.183279116" watchObservedRunningTime="2025-10-02 11:21:24.293325071 +0000 UTC m=+165.184478638" Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.452285 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.475504 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea83baf-570c-46db-ad98-aa9ec89d1c82-metrics-certs\") pod \"network-metrics-daemon-6fxls\" (UID: \"2ea83baf-570c-46db-ad98-aa9ec89d1c82\") " pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.598210 4658 patch_prober.go:28] interesting pod/router-default-5444994796-g4bk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:21:24 crc 
kubenswrapper[4658]: [-]has-synced failed: reason withheld Oct 02 11:21:24 crc kubenswrapper[4658]: [+]process-running ok Oct 02 11:21:24 crc kubenswrapper[4658]: healthz check failed Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.598280 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g4bk2" podUID="0912be1c-00d6-47fb-84fa-58b6569ea434" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:21:24 crc kubenswrapper[4658]: I1002 11:21:24.684397 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fxls" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.034815 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6fxls"] Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.281189 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.282088 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.286934 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.288748 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.297449 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fxls" event={"ID":"2ea83baf-570c-46db-ad98-aa9ec89d1c82","Type":"ContainerStarted","Data":"48a0c8dd5072b88ed473e4d63cbd7a5dfc6b6437762c3f74d4d431bdb93d2cac"} Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.299895 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.309828 4658 generic.go:334] "Generic (PLEG): container finished" podID="0c4d8047-922c-4594-8ba2-3624fc2e73c5" containerID="ce5d7db902a5fc2206fc85364beb309da8bbf0ac5c7340041390cc57bbf54d52" exitCode=0 Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.310800 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0c4d8047-922c-4594-8ba2-3624fc2e73c5","Type":"ContainerDied","Data":"ce5d7db902a5fc2206fc85364beb309da8bbf0ac5c7340041390cc57bbf54d52"} Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.378484 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.378570 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.485005 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.485905 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.485172 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.511236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.595577 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l6vlk" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.598575 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.604026 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-g4bk2" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.609206 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:25 crc kubenswrapper[4658]: I1002 11:21:25.980200 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.327203 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fxls" event={"ID":"2ea83baf-570c-46db-ad98-aa9ec89d1c82","Type":"ContainerStarted","Data":"3d0014a1cf27c9a1f9626229a257095e224759220afb27f8ca6c98b1bb53f7cb"} Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.336776 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"72f8f472-ba49-46d7-998d-627a4cf18df7","Type":"ContainerStarted","Data":"9e79ca82351558842ebfb5cde480662ca2e88f0cbd506af4ef0a36c55c54de81"} Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.686768 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.827882 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir\") pod \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.828053 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access\") pod \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\" (UID: \"0c4d8047-922c-4594-8ba2-3624fc2e73c5\") " Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.828170 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0c4d8047-922c-4594-8ba2-3624fc2e73c5" (UID: "0c4d8047-922c-4594-8ba2-3624fc2e73c5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.828716 4658 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.837138 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0c4d8047-922c-4594-8ba2-3624fc2e73c5" (UID: "0c4d8047-922c-4594-8ba2-3624fc2e73c5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:21:26 crc kubenswrapper[4658]: I1002 11:21:26.929517 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c4d8047-922c-4594-8ba2-3624fc2e73c5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.354139 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fxls" event={"ID":"2ea83baf-570c-46db-ad98-aa9ec89d1c82","Type":"ContainerStarted","Data":"a63d68857ac1f8ce07e58d331bf8c592e4791a8ae641a94a137bd0f619d2d28c"} Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.363789 4658 generic.go:334] "Generic (PLEG): container finished" podID="72f8f472-ba49-46d7-998d-627a4cf18df7" containerID="c36978400a7360ba0875f20299f88212c8e6619503d5e6bd09148dc65209f1f8" exitCode=0 Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.363880 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"72f8f472-ba49-46d7-998d-627a4cf18df7","Type":"ContainerDied","Data":"c36978400a7360ba0875f20299f88212c8e6619503d5e6bd09148dc65209f1f8"} Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.367200 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6fxls" podStartSLOduration=146.367185524 podStartE2EDuration="2m26.367185524s" podCreationTimestamp="2025-10-02 11:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:21:27.365661551 +0000 UTC m=+168.256815128" watchObservedRunningTime="2025-10-02 11:21:27.367185524 +0000 UTC m=+168.258339091" Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.374643 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0c4d8047-922c-4594-8ba2-3624fc2e73c5","Type":"ContainerDied","Data":"d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09"} Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.374701 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d116ecb091d4e9b7ed4b3033ba88564a452676f5108ac37211535850c6388d09" Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.374726 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.429557 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:21:27 crc kubenswrapper[4658]: I1002 11:21:27.429633 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:21:32 crc kubenswrapper[4658]: I1002 11:21:32.747606 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-w7rrv" Oct 02 11:21:32 crc kubenswrapper[4658]: I1002 11:21:32.866142 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:32 crc kubenswrapper[4658]: I1002 11:21:32.870270 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:21:35 crc kubenswrapper[4658]: I1002 11:21:35.904051 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.059765 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access\") pod \"72f8f472-ba49-46d7-998d-627a4cf18df7\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.059884 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir\") pod \"72f8f472-ba49-46d7-998d-627a4cf18df7\" (UID: \"72f8f472-ba49-46d7-998d-627a4cf18df7\") " Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.060062 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "72f8f472-ba49-46d7-998d-627a4cf18df7" (UID: "72f8f472-ba49-46d7-998d-627a4cf18df7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.060442 4658 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72f8f472-ba49-46d7-998d-627a4cf18df7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.064308 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "72f8f472-ba49-46d7-998d-627a4cf18df7" (UID: "72f8f472-ba49-46d7-998d-627a4cf18df7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.161620 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72f8f472-ba49-46d7-998d-627a4cf18df7-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.450056 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"72f8f472-ba49-46d7-998d-627a4cf18df7","Type":"ContainerDied","Data":"9e79ca82351558842ebfb5cde480662ca2e88f0cbd506af4ef0a36c55c54de81"} Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.450125 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e79ca82351558842ebfb5cde480662ca2e88f0cbd506af4ef0a36c55c54de81" Oct 02 11:21:36 crc kubenswrapper[4658]: I1002 11:21:36.450253 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:21:40 crc kubenswrapper[4658]: I1002 11:21:40.939190 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:21:53 crc kubenswrapper[4658]: I1002 11:21:53.843041 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6jkm" Oct 02 11:21:57 crc kubenswrapper[4658]: I1002 11:21:57.429928 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:21:57 crc kubenswrapper[4658]: I1002 11:21:57.430019 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:22:03 crc kubenswrapper[4658]: E1002 11:22:03.381363 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 11:22:03 crc kubenswrapper[4658]: E1002 11:22:03.382028 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wllx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fzf5c_openshift-marketplace(cc5605be-988f-43bc-b3e1-4d7346ef81cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:03 crc kubenswrapper[4658]: E1002 11:22:03.383210 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fzf5c" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" Oct 02 11:22:03 crc kubenswrapper[4658]: E1002 11:22:03.945278 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fzf5c" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" Oct 02 11:22:04 crc kubenswrapper[4658]: E1002 11:22:04.655420 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 11:22:04 crc kubenswrapper[4658]: E1002 11:22:04.655560 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7xx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qzshr_openshift-marketplace(ed79e082-6f4d-418d-bf20-621fb495976a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:04 crc kubenswrapper[4658]: E1002 11:22:04.656779 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qzshr" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" Oct 02 11:22:08 crc kubenswrapper[4658]: E1002 11:22:08.615523 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qzshr" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.218253 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.218716 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m77dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6j7kz_openshift-marketplace(b3286b84-35b9-4116-b8e9-e84fb5f50d23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.220532 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6j7kz" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.296222 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.296368 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4wj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nn4mk_openshift-marketplace(5e4d275a-1db6-471e-87f3-162c144e7586): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.297977 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nn4mk" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.331224 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.331379 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66zsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wchxv_openshift-marketplace(578b83fe-55ef-4dc7-8df1-d1e2fce37db8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.332548 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wchxv" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.390452 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.390603 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vmd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lvfxq_openshift-marketplace(dc132895-415b-49fb-86b7-63c32990a0cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.391848 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lvfxq" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.597670 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.597804 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mb44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tjkgd_openshift-marketplace(45634610-7bec-413b-8b11-3b90a851b37b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.599354 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tjkgd" podUID="45634610-7bec-413b-8b11-3b90a851b37b" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.608729 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.608952 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hrhc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lrxpp_openshift-marketplace(10d795d1-5e35-42da-9cd9-9761be302b1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.610219 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lrxpp" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.628586 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lvfxq" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.629088 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nn4mk" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.629654 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tjkgd" podUID="45634610-7bec-413b-8b11-3b90a851b37b" Oct 02 11:22:10 crc kubenswrapper[4658]: E1002 11:22:10.629722 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wchxv" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" Oct 02 11:22:10 
crc kubenswrapper[4658]: E1002 11:22:10.630035 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6j7kz" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" Oct 02 11:22:19 crc kubenswrapper[4658]: I1002 11:22:19.682569 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerStarted","Data":"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c"} Oct 02 11:22:20 crc kubenswrapper[4658]: I1002 11:22:20.693924 4658 generic.go:334] "Generic (PLEG): container finished" podID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerID="3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c" exitCode=0 Oct 02 11:22:20 crc kubenswrapper[4658]: I1002 11:22:20.694010 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerDied","Data":"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c"} Oct 02 11:22:20 crc kubenswrapper[4658]: I1002 11:22:20.699124 4658 generic.go:334] "Generic (PLEG): container finished" podID="ed79e082-6f4d-418d-bf20-621fb495976a" containerID="9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf" exitCode=0 Oct 02 11:22:20 crc kubenswrapper[4658]: I1002 11:22:20.699197 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerDied","Data":"9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf"} Oct 02 11:22:21 crc kubenswrapper[4658]: I1002 11:22:21.706246 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerStarted","Data":"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c"} Oct 02 11:22:21 crc kubenswrapper[4658]: I1002 11:22:21.988146 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qzshr" podStartSLOduration=2.9827929380000002 podStartE2EDuration="1m2.988118744s" podCreationTimestamp="2025-10-02 11:21:19 +0000 UTC" firstStartedPulling="2025-10-02 11:21:21.105784528 +0000 UTC m=+161.996938095" lastFinishedPulling="2025-10-02 11:22:21.111110304 +0000 UTC m=+222.002263901" observedRunningTime="2025-10-02 11:22:21.722483973 +0000 UTC m=+222.613637560" watchObservedRunningTime="2025-10-02 11:22:21.988118744 +0000 UTC m=+222.879272351" Oct 02 11:22:22 crc kubenswrapper[4658]: I1002 11:22:22.716904 4658 generic.go:334] "Generic (PLEG): container finished" podID="45634610-7bec-413b-8b11-3b90a851b37b" containerID="a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200" exitCode=0 Oct 02 11:22:22 crc kubenswrapper[4658]: I1002 11:22:22.717081 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerDied","Data":"a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200"} Oct 02 11:22:22 crc kubenswrapper[4658]: I1002 11:22:22.722074 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerStarted","Data":"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c"} Oct 02 11:22:22 crc kubenswrapper[4658]: I1002 11:22:22.762718 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzf5c" podStartSLOduration=3.633412206 podStartE2EDuration="1m0.762698313s" podCreationTimestamp="2025-10-02 11:21:22 +0000 UTC" firstStartedPulling="2025-10-02 11:21:24.279657102 +0000 UTC m=+165.170810669" lastFinishedPulling="2025-10-02 11:22:21.408943209 +0000 UTC m=+222.300096776" observedRunningTime="2025-10-02 11:22:22.760640621 +0000 UTC m=+223.651794198" watchObservedRunningTime="2025-10-02 11:22:22.762698313 +0000 UTC m=+223.653851900" Oct 02 11:22:23 crc kubenswrapper[4658]: I1002 11:22:23.728156 4658 generic.go:334] "Generic (PLEG): container finished" podID="5e4d275a-1db6-471e-87f3-162c144e7586" containerID="e4101fa7d3aa0a5e8a1acf14252fd07ea256a136e9ca6c22c5196b189a6b3915" exitCode=0 Oct 02 11:22:23 crc kubenswrapper[4658]: I1002 11:22:23.728236 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerDied","Data":"e4101fa7d3aa0a5e8a1acf14252fd07ea256a136e9ca6c22c5196b189a6b3915"} Oct 02 11:22:23 crc kubenswrapper[4658]: I1002 11:22:23.730427 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerStarted","Data":"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965"} Oct 02 11:22:23 crc kubenswrapper[4658]: I1002 11:22:23.774327 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjkgd" podStartSLOduration=2.844986853 podStartE2EDuration="1m2.774279949s" podCreationTimestamp="2025-10-02 11:21:21 +0000 UTC" firstStartedPulling="2025-10-02 11:21:23.203931975 +0000 UTC m=+164.095085532" lastFinishedPulling="2025-10-02 11:22:23.133225061 +0000 UTC m=+224.024378628" observedRunningTime="2025-10-02 11:22:23.773891456 +0000 UTC m=+224.665045033" watchObservedRunningTime="2025-10-02 11:22:23.774279949 +0000 UTC m=+224.665433536" Oct 02 11:22:24 crc kubenswrapper[4658]: I1002 11:22:24.737680 4658 generic.go:334] "Generic (PLEG): container finished" podID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerID="c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b" exitCode=0 Oct 02 11:22:24 crc kubenswrapper[4658]: I1002 11:22:24.738054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerDied","Data":"c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b"} Oct 02 11:22:24 crc kubenswrapper[4658]: I1002 11:22:24.741258 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerStarted","Data":"90b4c5978b2ade7a61e030b752c77374be2dde5448572453a8edac852e8e77c2"} Oct 02 11:22:24 crc kubenswrapper[4658]: I1002 11:22:24.776139 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nn4mk" podStartSLOduration=2.690432631 podStartE2EDuration="1m5.776121182s" 
podCreationTimestamp="2025-10-02 11:21:19 +0000 UTC" firstStartedPulling="2025-10-02 11:21:21.06850135 +0000 UTC m=+161.959654917" lastFinishedPulling="2025-10-02 11:22:24.154189901 +0000 UTC m=+225.045343468" observedRunningTime="2025-10-02 11:22:24.774461485 +0000 UTC m=+225.665615072" watchObservedRunningTime="2025-10-02 11:22:24.776121182 +0000 UTC m=+225.667274749" Oct 02 11:22:25 crc kubenswrapper[4658]: I1002 11:22:25.748073 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerStarted","Data":"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e"} Oct 02 11:22:25 crc kubenswrapper[4658]: I1002 11:22:25.750717 4658 generic.go:334] "Generic (PLEG): container finished" podID="dc132895-415b-49fb-86b7-63c32990a0cd" containerID="cd64e319b6d7d021cf85a95a99f157d1599336f53a74e96fe470bec42558548c" exitCode=0 Oct 02 11:22:25 crc kubenswrapper[4658]: I1002 11:22:25.750756 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerDied","Data":"cd64e319b6d7d021cf85a95a99f157d1599336f53a74e96fe470bec42558548c"} Oct 02 11:22:25 crc kubenswrapper[4658]: I1002 11:22:25.781524 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wchxv" podStartSLOduration=2.5948880450000003 podStartE2EDuration="1m6.781509857s" podCreationTimestamp="2025-10-02 11:21:19 +0000 UTC" firstStartedPulling="2025-10-02 11:21:21.060406102 +0000 UTC m=+161.951559669" lastFinishedPulling="2025-10-02 11:22:25.247027914 +0000 UTC m=+226.138181481" observedRunningTime="2025-10-02 11:22:25.779596931 +0000 UTC m=+226.670750498" watchObservedRunningTime="2025-10-02 11:22:25.781509857 +0000 UTC m=+226.672663424" Oct 02 11:22:26 crc kubenswrapper[4658]: I1002 11:22:26.757519 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerStarted","Data":"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"} Oct 02 11:22:26 crc kubenswrapper[4658]: I1002 11:22:26.761038 4658 generic.go:334] "Generic (PLEG): container finished" podID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerID="436692a0b9efc66350b242e53034abbd68b369b63cfb24cdf2df5289e0c3d4d4" exitCode=0 Oct 02 11:22:26 crc kubenswrapper[4658]: I1002 11:22:26.761140 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerDied","Data":"436692a0b9efc66350b242e53034abbd68b369b63cfb24cdf2df5289e0c3d4d4"} Oct 02 11:22:26 crc kubenswrapper[4658]: I1002 11:22:26.763871 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerStarted","Data":"be17b99ca1ae72dd23bbe29be3ef5dfde3de350f582a765e9f13b986f2202c39"} Oct 02 11:22:26 crc kubenswrapper[4658]: I1002 11:22:26.805840 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lvfxq" podStartSLOduration=2.601319664 podStartE2EDuration="1m4.80582412s" podCreationTimestamp="2025-10-02 11:21:22 +0000 UTC" firstStartedPulling="2025-10-02 11:21:24.242634762 +0000 UTC m=+165.133788329" lastFinishedPulling="2025-10-02 
11:22:26.447139218 +0000 UTC m=+227.338292785" observedRunningTime="2025-10-02 11:22:26.80375875 +0000 UTC m=+227.694912337" watchObservedRunningTime="2025-10-02 11:22:26.80582412 +0000 UTC m=+227.696977687" Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.429919 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.430275 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.430369 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.430891 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.430935 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7" gracePeriod=600 Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.771130 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerStarted","Data":"985a5f42297da0cbfee80221fa6005d69b5e3c57f85e0929fa8421ccb0ddd37f"} Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.773512 4658 generic.go:334] "Generic (PLEG): container finished" podID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerID="a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74" exitCode=0 Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.773588 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerDied","Data":"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"} Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.776191 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7" exitCode=0 Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.776237 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7"} Oct 02 11:22:27 crc kubenswrapper[4658]: I1002 11:22:27.799362 4658 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6j7kz" podStartSLOduration=2.722764872 podStartE2EDuration="1m8.799326628s" podCreationTimestamp="2025-10-02 11:21:19 +0000 UTC" firstStartedPulling="2025-10-02 11:21:21.104970211 +0000 UTC m=+161.996123778" lastFinishedPulling="2025-10-02 11:22:27.181531967 +0000 UTC m=+228.072685534" observedRunningTime="2025-10-02 11:22:27.793705905 +0000 UTC m=+228.684859472" watchObservedRunningTime="2025-10-02 11:22:27.799326628 +0000 UTC m=+228.690480255" Oct 02 11:22:28 crc kubenswrapper[4658]: I1002 11:22:28.784785 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7"} Oct 02 11:22:28 crc kubenswrapper[4658]: I1002 11:22:28.786758 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerStarted","Data":"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d"} Oct 02 11:22:28 crc kubenswrapper[4658]: I1002 11:22:28.828280 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lrxpp" podStartSLOduration=3.69417585 podStartE2EDuration="1m7.82825981s" podCreationTimestamp="2025-10-02 11:21:21 +0000 UTC" firstStartedPulling="2025-10-02 11:21:24.284390924 +0000 UTC m=+165.175544491" lastFinishedPulling="2025-10-02 11:22:28.418474884 +0000 UTC m=+229.309628451" observedRunningTime="2025-10-02 11:22:28.823185666 +0000 UTC m=+229.714339243" watchObservedRunningTime="2025-10-02 11:22:28.82825981 +0000 UTC m=+229.719413387" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.389830 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.389969 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.580930 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.581515 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.618185 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.619946 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.816332 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.816390 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.830526 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.875168 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.993855 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:29 crc kubenswrapper[4658]: I1002 11:22:29.994124 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:30 crc kubenswrapper[4658]: I1002 11:22:30.034671 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:30 crc kubenswrapper[4658]: I1002 11:22:30.842592 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:30 crc kubenswrapper[4658]: I1002 11:22:30.907762 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.556964 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.557355 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.618982 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.848753 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.972251 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:22:31 crc kubenswrapper[4658]: I1002 11:22:31.972331 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:22:32 crc kubenswrapper[4658]: I1002 11:22:32.017107 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lrxpp" Oct 02 11:22:32 crc kubenswrapper[4658]: I1002 11:22:32.559821 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:22:32 crc kubenswrapper[4658]: I1002 11:22:32.560268 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:22:32 crc kubenswrapper[4658]: I1002 11:22:32.619100 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:22:32 crc kubenswrapper[4658]: I1002 11:22:32.875679 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.005510 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.005547 4658 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.054753 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.592325 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.818222 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nn4mk" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="registry-server" containerID="cri-o://90b4c5978b2ade7a61e030b752c77374be2dde5448572453a8edac852e8e77c2" gracePeriod=2 Oct 02 11:22:33 crc kubenswrapper[4658]: I1002 11:22:33.867787 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lvfxq" Oct 02 11:22:35 crc kubenswrapper[4658]: I1002 11:22:35.829124 4658 generic.go:334] "Generic (PLEG): container finished" podID="5e4d275a-1db6-471e-87f3-162c144e7586" containerID="90b4c5978b2ade7a61e030b752c77374be2dde5448572453a8edac852e8e77c2" exitCode=0 Oct 02 11:22:35 crc kubenswrapper[4658]: I1002 11:22:35.829172 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerDied","Data":"90b4c5978b2ade7a61e030b752c77374be2dde5448572453a8edac852e8e77c2"} Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.003968 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"] Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.004709 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lvfxq" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="registry-server" containerID="cri-o://be17b99ca1ae72dd23bbe29be3ef5dfde3de350f582a765e9f13b986f2202c39" gracePeriod=2 Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.065453 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.086493 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities\") pod \"5e4d275a-1db6-471e-87f3-162c144e7586\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.086614 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content\") pod \"5e4d275a-1db6-471e-87f3-162c144e7586\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.086647 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m4wj\" (UniqueName: \"kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj\") pod \"5e4d275a-1db6-471e-87f3-162c144e7586\" (UID: \"5e4d275a-1db6-471e-87f3-162c144e7586\") " Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.087641 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities" (OuterVolumeSpecName: "utilities") pod "5e4d275a-1db6-471e-87f3-162c144e7586" (UID: "5e4d275a-1db6-471e-87f3-162c144e7586"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.098518 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj" (OuterVolumeSpecName: "kube-api-access-2m4wj") pod "5e4d275a-1db6-471e-87f3-162c144e7586" (UID: "5e4d275a-1db6-471e-87f3-162c144e7586"). InnerVolumeSpecName "kube-api-access-2m4wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.188617 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m4wj\" (UniqueName: \"kubernetes.io/projected/5e4d275a-1db6-471e-87f3-162c144e7586-kube-api-access-2m4wj\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.188878 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.842218 4658 generic.go:334] "Generic (PLEG): container finished" podID="dc132895-415b-49fb-86b7-63c32990a0cd" containerID="be17b99ca1ae72dd23bbe29be3ef5dfde3de350f582a765e9f13b986f2202c39" exitCode=0 Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.842284 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerDied","Data":"be17b99ca1ae72dd23bbe29be3ef5dfde3de350f582a765e9f13b986f2202c39"} Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.844816 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nn4mk" event={"ID":"5e4d275a-1db6-471e-87f3-162c144e7586","Type":"ContainerDied","Data":"596444e2df090e358deb8a9806c53b3dee1fdd7fecd9800dbccef1d857961d40"} Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.844880 4658 scope.go:117] "RemoveContainer" containerID="90b4c5978b2ade7a61e030b752c77374be2dde5448572453a8edac852e8e77c2" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.844912 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nn4mk" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.872785 4658 scope.go:117] "RemoveContainer" containerID="e4101fa7d3aa0a5e8a1acf14252fd07ea256a136e9ca6c22c5196b189a6b3915" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.890236 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e4d275a-1db6-471e-87f3-162c144e7586" (UID: "5e4d275a-1db6-471e-87f3-162c144e7586"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.892406 4658 scope.go:117] "RemoveContainer" containerID="14c65e3b2a5a75810b7a8dc0f3fc6449f864e3fb2ad89a346dbec261855007be" Oct 02 11:22:37 crc kubenswrapper[4658]: I1002 11:22:37.897698 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4d275a-1db6-471e-87f3-162c144e7586-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.161829 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.165335 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nn4mk"] Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.421980 4658 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.421980 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvfxq"
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.508330 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content\") pod \"dc132895-415b-49fb-86b7-63c32990a0cd\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") "
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.508408 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities\") pod \"dc132895-415b-49fb-86b7-63c32990a0cd\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") "
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.508458 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vmd7\" (UniqueName: \"kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7\") pod \"dc132895-415b-49fb-86b7-63c32990a0cd\" (UID: \"dc132895-415b-49fb-86b7-63c32990a0cd\") "
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.511920 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities" (OuterVolumeSpecName: "utilities") pod "dc132895-415b-49fb-86b7-63c32990a0cd" (UID: "dc132895-415b-49fb-86b7-63c32990a0cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.517619 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7" (OuterVolumeSpecName: "kube-api-access-8vmd7") pod "dc132895-415b-49fb-86b7-63c32990a0cd" (UID: "dc132895-415b-49fb-86b7-63c32990a0cd"). InnerVolumeSpecName "kube-api-access-8vmd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.609866 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.609916 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vmd7\" (UniqueName: \"kubernetes.io/projected/dc132895-415b-49fb-86b7-63c32990a0cd-kube-api-access-8vmd7\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.855380 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvfxq" event={"ID":"dc132895-415b-49fb-86b7-63c32990a0cd","Type":"ContainerDied","Data":"e2ee882b0c77b7fc537fd66de4f6e2f680c829d7b35b3f44a8cf1b700c421b8d"}
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.855446 4658 scope.go:117] "RemoveContainer" containerID="be17b99ca1ae72dd23bbe29be3ef5dfde3de350f582a765e9f13b986f2202c39"
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.855948 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvfxq"
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.873365 4658 scope.go:117] "RemoveContainer" containerID="cd64e319b6d7d021cf85a95a99f157d1599336f53a74e96fe470bec42558548c"
Oct 02 11:22:38 crc kubenswrapper[4658]: I1002 11:22:38.895356 4658 scope.go:117] "RemoveContainer" containerID="d2419bb356296951a88c9869f743592dba35f614c4cf29507d4679cbafb61ca1"
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.411340 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc132895-415b-49fb-86b7-63c32990a0cd" (UID: "dc132895-415b-49fb-86b7-63c32990a0cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.420183 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc132895-415b-49fb-86b7-63c32990a0cd-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.479834 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"]
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.484215 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lvfxq"]
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.867739 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6j7kz"
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.957111 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" path="/var/lib/kubelet/pods/5e4d275a-1db6-471e-87f3-162c144e7586/volumes"
Oct 02 11:22:39 crc kubenswrapper[4658]: I1002 11:22:39.958086 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" path="/var/lib/kubelet/pods/dc132895-415b-49fb-86b7-63c32990a0cd/volumes"
Oct 02 11:22:41 crc kubenswrapper[4658]: I1002 11:22:41.995790 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"]
Oct 02 11:22:41 crc kubenswrapper[4658]: I1002 11:22:41.996242 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6j7kz" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="registry-server" containerID="cri-o://985a5f42297da0cbfee80221fa6005d69b5e3c57f85e0929fa8421ccb0ddd37f" gracePeriod=2
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.017399 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lrxpp"
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.891508 4658 generic.go:334] "Generic (PLEG): container finished" podID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerID="985a5f42297da0cbfee80221fa6005d69b5e3c57f85e0929fa8421ccb0ddd37f" exitCode=0
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.891589 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerDied","Data":"985a5f42297da0cbfee80221fa6005d69b5e3c57f85e0929fa8421ccb0ddd37f"}
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.928509 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j7kz"
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.966371 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-catalog-content\") pod \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") "
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.966430 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities\") pod \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") "
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.966469 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m77dc\" (UniqueName: \"kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc\") pod \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\" (UID: \"b3286b84-35b9-4116-b8e9-e84fb5f50d23\") "
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.967530 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities" (OuterVolumeSpecName: "utilities") pod "b3286b84-35b9-4116-b8e9-e84fb5f50d23" (UID: "b3286b84-35b9-4116-b8e9-e84fb5f50d23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4658]: I1002 11:22:42.971908 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc" (OuterVolumeSpecName: "kube-api-access-m77dc") pod "b3286b84-35b9-4116-b8e9-e84fb5f50d23" (UID: "b3286b84-35b9-4116-b8e9-e84fb5f50d23"). InnerVolumeSpecName "kube-api-access-m77dc". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.067873 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.067906 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3286b84-35b9-4116-b8e9-e84fb5f50d23-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.067918 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m77dc\" (UniqueName: \"kubernetes.io/projected/b3286b84-35b9-4116-b8e9-e84fb5f50d23-kube-api-access-m77dc\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.133617 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.907406 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6j7kz" event={"ID":"b3286b84-35b9-4116-b8e9-e84fb5f50d23","Type":"ContainerDied","Data":"18903b8742ba185baa54a1109ba925a27ac90738823992ff5c368e9514e4148d"} Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.907490 4658 scope.go:117] "RemoveContainer" containerID="985a5f42297da0cbfee80221fa6005d69b5e3c57f85e0929fa8421ccb0ddd37f" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.907599 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6j7kz" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.936762 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"] Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.938823 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6j7kz"] Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.942006 4658 scope.go:117] "RemoveContainer" containerID="436692a0b9efc66350b242e53034abbd68b369b63cfb24cdf2df5289e0c3d4d4" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.955592 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" path="/var/lib/kubelet/pods/b3286b84-35b9-4116-b8e9-e84fb5f50d23/volumes" Oct 02 11:22:43 crc kubenswrapper[4658]: I1002 11:22:43.959082 4658 scope.go:117] "RemoveContainer" containerID="1f16b6471eecdc81738169c95b417626e7ea0a29df92740cf7bde2b790cca795" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.392959 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"] Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.394602 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lrxpp" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="registry-server" containerID="cri-o://c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d" gracePeriod=2 Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.768834 4658 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.768834 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrxpp"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.814921 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content\") pod \"10d795d1-5e35-42da-9cd9-9761be302b1b\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") "
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.815032 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities\") pod \"10d795d1-5e35-42da-9cd9-9761be302b1b\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") "
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.815073 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrhc8\" (UniqueName: \"kubernetes.io/projected/10d795d1-5e35-42da-9cd9-9761be302b1b-kube-api-access-hrhc8\") pod \"10d795d1-5e35-42da-9cd9-9761be302b1b\" (UID: \"10d795d1-5e35-42da-9cd9-9761be302b1b\") "
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.816232 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities" (OuterVolumeSpecName: "utilities") pod "10d795d1-5e35-42da-9cd9-9761be302b1b" (UID: "10d795d1-5e35-42da-9cd9-9761be302b1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.828281 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10d795d1-5e35-42da-9cd9-9761be302b1b" (UID: "10d795d1-5e35-42da-9cd9-9761be302b1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.917195 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.917239 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d795d1-5e35-42da-9cd9-9761be302b1b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.917249 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrhc8\" (UniqueName: \"kubernetes.io/projected/10d795d1-5e35-42da-9cd9-9761be302b1b-kube-api-access-hrhc8\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924475 4658 generic.go:334] "Generic (PLEG): container finished" podID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerID="c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d" exitCode=0 Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924521 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerDied","Data":"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d"} Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924549 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrxpp" event={"ID":"10d795d1-5e35-42da-9cd9-9761be302b1b","Type":"ContainerDied","Data":"ecbdad5a576180f5f76f8f9a67814930ddbfbf9f66135be8be1d6549eb547fbc"} Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924566 4658 scope.go:117] "RemoveContainer" containerID="c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924677 4658 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.924677 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrxpp"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.943166 4658 scope.go:117] "RemoveContainer" containerID="a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.950980 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"]
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.954306 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrxpp"]
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.962364 4658 scope.go:117] "RemoveContainer" containerID="f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.993407 4658 scope.go:117] "RemoveContainer" containerID="c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d"
Oct 02 11:22:46 crc kubenswrapper[4658]: E1002 11:22:46.993896 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d\": container with ID starting with c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d not found: ID does not exist" containerID="c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.993954 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d"} err="failed to get container status \"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d\": rpc error: code = NotFound desc = could not find container \"c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d\": container with ID starting with c2a8276c13334a4d2dde65ccb7d6aa39108c14cc26eb1d3c04a0d38b5879e11d not found: ID does not exist"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.993990 4658 scope.go:117] "RemoveContainer" containerID="a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"
Oct 02 11:22:46 crc kubenswrapper[4658]: E1002 11:22:46.994434 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74\": container with ID starting with a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74 not found: ID does not exist" containerID="a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.994499 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74"} err="failed to get container status \"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74\": rpc error: code = NotFound desc = could not find container \"a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74\": container with ID starting with a010711abfb8d3858efc7d83301ff87e62075d7a6e5b3fcdd0ae38b03bf82a74 not found: ID does not exist"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.994559 4658 scope.go:117] "RemoveContainer" containerID="f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"
Oct 02 11:22:46 crc kubenswrapper[4658]: E1002 11:22:46.994904 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": container with ID starting with f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872 not found: ID does not exist" containerID="f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"
Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.994940 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"} err="failed to get container status \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": rpc error: code = NotFound desc = could not find container \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": container with ID starting with f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872 not found: ID does not exist"
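The NotFound errors above are expected noise rather than failures: the containers were already removed along with their pod sandbox, so the follow-up ContainerStatus lookup cannot find them and kubelet just logs the error and moves on. A sketch of that idempotent-delete pattern; errNotFound here stands in for the gRPC codes.NotFound status the real code checks:

```go
// Deleting an already-deleted container is treated as success.
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound: ID does not exist")

// containerStatus is a stand-in for the CRI ContainerStatus call.
func containerStatus(store map[string]bool, id string) error {
	if !store[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	return nil
}

func removeContainer(store map[string]bool, id string) {
	if err := containerStatus(store, id); err != nil {
		if errors.Is(err, errNotFound) {
			// Already gone: log and continue, as kubelet does above.
			fmt.Printf("DeleteContainer returned error %v (ignored)\n", err)
			return
		}
		panic(err) // any other error would be a real failure
	}
	delete(store, id)
}

func main() {
	store := map[string]bool{} // container already removed with its pod
	removeContainer(store, "c2a8276c1333")
}
```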
failed" err="rpc error: code = NotFound desc = could not find container \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": container with ID starting with f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872 not found: ID does not exist" containerID="f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872" Oct 02 11:22:46 crc kubenswrapper[4658]: I1002 11:22:46.994940 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872"} err="failed to get container status \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": rpc error: code = NotFound desc = could not find container \"f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872\": container with ID starting with f1f2c57df711083ffad619b34aeb5e86aa8b64696701fc31d620c53dd3f16872 not found: ID does not exist" Oct 02 11:22:47 crc kubenswrapper[4658]: I1002 11:22:47.956559 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" path="/var/lib/kubelet/pods/10d795d1-5e35-42da-9cd9-9761be302b1b/volumes" Oct 02 11:23:08 crc kubenswrapper[4658]: I1002 11:23:08.159591 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift" containerID="cri-o://b1dfeda52893d1d544ddc89cd1af4f0984c3fe3c578f0ae7bbe564efe7e26f33" gracePeriod=15 Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.040337 4658 generic.go:334] "Generic (PLEG): container finished" podID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerID="b1dfeda52893d1d544ddc89cd1af4f0984c3fe3c578f0ae7bbe564efe7e26f33" exitCode=0 Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.040681 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" event={"ID":"fecb5f70-edd2-466b-a31f-25b1db79aec5","Type":"ContainerDied","Data":"b1dfeda52893d1d544ddc89cd1af4f0984c3fe3c578f0ae7bbe564efe7e26f33"} Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.227834 4658 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.227834 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.265998 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57dbc969ff-m2sbw"]
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266420 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266467 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266484 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266493 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266508 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="registry-server"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266517 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="registry-server"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266556 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266565 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="extract-content"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266576 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="registry-server"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266583 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="registry-server"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266594 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f8f472-ba49-46d7-998d-627a4cf18df7" containerName="pruner"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266626 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f8f472-ba49-46d7-998d-627a4cf18df7" containerName="pruner"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266637 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266645 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift"
Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266657 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4d8047-922c-4594-8ba2-3624fc2e73c5" containerName="pruner"
Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266665 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4d8047-922c-4594-8ba2-3624fc2e73c5" containerName="pruner"
podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266686 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266699 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266708 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266721 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266729 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266740 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266748 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266760 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266768 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266779 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="extract-content" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266787 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="extract-content" Oct 02 11:23:09 crc kubenswrapper[4658]: E1002 11:23:09.266859 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266868 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="extract-utilities" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.266994 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3286b84-35b9-4116-b8e9-e84fb5f50d23" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267006 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d795d1-5e35-42da-9cd9-9761be302b1b" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267020 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f8f472-ba49-46d7-998d-627a4cf18df7" containerName="pruner" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267032 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c4d8047-922c-4594-8ba2-3624fc2e73c5" containerName="pruner" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267049 4658 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" containerName="oauth-openshift" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267057 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc132895-415b-49fb-86b7-63c32990a0cd" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267068 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4d275a-1db6-471e-87f3-162c144e7586" containerName="registry-server" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.267530 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.269750 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57dbc969ff-m2sbw"] Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307745 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307808 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307845 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307883 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307901 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307939 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.308780 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.307964 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309052 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309091 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309132 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309186 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309236 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309352 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login\") pod 
\"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309394 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309429 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session\") pod \"fecb5f70-edd2-466b-a31f-25b1db79aec5\" (UID: \"fecb5f70-edd2-466b-a31f-25b1db79aec5\") " Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309513 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309587 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309620 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-service-ca\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309733 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-policies\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309818 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309860 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " 
pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309902 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-error\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309946 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309972 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-session\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.309996 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-router-certs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310025 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310080 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310102 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-dir\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310124 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlrs\" (UniqueName: 
\"kubernetes.io/projected/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-kube-api-access-wrlrs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310154 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-login\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310197 4658 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310211 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310223 4658 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.310911 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.311591 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.313048 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.313312 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.313366 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.313680 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.314442 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.317747 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.317791 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.317832 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.318308 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv" (OuterVolumeSpecName: "kube-api-access-pv8xv") pod "fecb5f70-edd2-466b-a31f-25b1db79aec5" (UID: "fecb5f70-edd2-466b-a31f-25b1db79aec5"). InnerVolumeSpecName "kube-api-access-pv8xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411474 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411539 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411574 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-error\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411598 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411623 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-session\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411647 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-router-certs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411671 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411710 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-dir\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " 
pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411732 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411756 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlrs\" (UniqueName: \"kubernetes.io/projected/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-kube-api-access-wrlrs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411783 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-login\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411809 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411830 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-service-ca\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411884 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-policies\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411965 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411980 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/fecb5f70-edd2-466b-a31f-25b1db79aec5-kube-api-access-pv8xv\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.411993 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-session\") on node \"crc\" 
DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412008 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412022 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412034 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412046 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412059 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412072 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412084 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412097 4658 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fecb5f70-edd2-466b-a31f-25b1db79aec5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412509 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-dir\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412882 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.412907 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-audit-policies\") pod 
\"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.414152 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-service-ca\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.416214 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.416222 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-session\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.416361 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.418930 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.419066 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.419260 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-router-certs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.419330 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.420258 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-login\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.428145 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlrs\" (UniqueName: \"kubernetes.io/projected/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-kube-api-access-wrlrs\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.434254 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5b40b01-7477-4eef-95d3-ff2324c9cbdc-v4-0-config-user-template-error\") pod \"oauth-openshift-57dbc969ff-m2sbw\" (UID: \"b5b40b01-7477-4eef-95d3-ff2324c9cbdc\") " pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.592208 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:09 crc kubenswrapper[4658]: I1002 11:23:09.836989 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57dbc969ff-m2sbw"] Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.049703 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" event={"ID":"fecb5f70-edd2-466b-a31f-25b1db79aec5","Type":"ContainerDied","Data":"e11c8d2d98441577cdc5203468750ca9a69678ba385d41969073d8f00cfcd74e"} Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.050038 4658 scope.go:117] "RemoveContainer" containerID="b1dfeda52893d1d544ddc89cd1af4f0984c3fe3c578f0ae7bbe564efe7e26f33" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.049734 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvclq" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.052116 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" event={"ID":"b5b40b01-7477-4eef-95d3-ff2324c9cbdc","Type":"ContainerStarted","Data":"1c1355030f5901fcc06d0a0536dc151c5b16b80420a50d5e43b2d2b402a16ace"} Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.070177 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.076203 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvclq"] Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.931596 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.931653 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.931680 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.931760 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.933722 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.933857 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.934465 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.943198 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:23:10 crc kubenswrapper[4658]: 
I1002 11:23:10.944341 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.946100 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.956277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.956559 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:23:10 crc kubenswrapper[4658]: I1002 11:23:10.961519 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.059245 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" event={"ID":"b5b40b01-7477-4eef-95d3-ff2324c9cbdc","Type":"ContainerStarted","Data":"df60572beb08414e07fb6f8dcd4971a83d249d7928d734f9020a955afc387b08"} Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.059623 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.069462 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.082021 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57dbc969ff-m2sbw" podStartSLOduration=28.081995681 podStartE2EDuration="28.081995681s" podCreationTimestamp="2025-10-02 11:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:23:11.079788506 +0000 UTC m=+271.970942083" watchObservedRunningTime="2025-10-02 11:23:11.081995681 +0000 UTC m=+271.973149268" Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.168236 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.171767 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:23:11 crc kubenswrapper[4658]: W1002 11:23:11.521893 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-121b799c252b6a7bfd38aeb3093ad92f4ef2de73504eef10312635f648ec35d3 WatchSource:0}: Error finding container 121b799c252b6a7bfd38aeb3093ad92f4ef2de73504eef10312635f648ec35d3: Status 404 returned error can't find the container with id 121b799c252b6a7bfd38aeb3093ad92f4ef2de73504eef10312635f648ec35d3 Oct 02 11:23:11 crc kubenswrapper[4658]: I1002 11:23:11.955377 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecb5f70-edd2-466b-a31f-25b1db79aec5" path="/var/lib/kubelet/pods/fecb5f70-edd2-466b-a31f-25b1db79aec5/volumes" Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.069273 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"61f77cf8ad496221b62c1903ca5534b341c53ad8144293be26ee97df1025a6e3"} Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.069333 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e09b7407961663ee30b38dd4f8a76929fd06f938af2c88736f5fce63b3d163bf"} Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.069450 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.070850 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b68b348d57af3d93ca63805b141777232c310f2eac429c274e1b6ffc4216a82b"} Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.070875 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"121b799c252b6a7bfd38aeb3093ad92f4ef2de73504eef10312635f648ec35d3"} Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.072201 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e97a4821a0112bdb5a3fb548812f1330c52188273c714393aec9c2505f67ca29"} Oct 02 11:23:12 crc kubenswrapper[4658]: I1002 11:23:12.072224 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e781b17f0c96c45622d89ae433fe9f7c5c78201712bf837312c4ba4ab567da58"} Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.577141 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.577982 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wchxv" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" 
containerName="registry-server" containerID="cri-o://ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e" gracePeriod=30 Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.584902 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.585146 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qzshr" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="registry-server" containerID="cri-o://d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c" gracePeriod=30 Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.593808 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.594668 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" containerID="cri-o://d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859" gracePeriod=30 Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.605758 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.606201 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjkgd" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="registry-server" containerID="cri-o://14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965" gracePeriod=30 Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.612040 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.612354 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fzf5c" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="registry-server" containerID="cri-o://922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c" gracePeriod=30 Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.614503 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clrx4"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.616106 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.625491 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clrx4"] Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.757025 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnplx\" (UniqueName: \"kubernetes.io/projected/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-kube-api-access-lnplx\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.757113 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.757147 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.858342 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.858712 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.858809 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnplx\" (UniqueName: \"kubernetes.io/projected/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-kube-api-access-lnplx\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.869173 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.869895 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.883065 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnplx\" (UniqueName: \"kubernetes.io/projected/0cdd5f96-dd0d-4f77-8e41-83a8493dbca7-kube-api-access-lnplx\") pod \"marketplace-operator-79b997595-clrx4\" (UID: \"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7\") " pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:27 crc kubenswrapper[4658]: I1002 11:23:27.934109 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.008704 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.015573 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.018067 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.019645 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.044527 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.150200 4658 generic.go:334] "Generic (PLEG): container finished" podID="571e8f9f-9662-4139-9cf5-51093519d329" containerID="d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859" exitCode=0 Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.150287 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" event={"ID":"571e8f9f-9662-4139-9cf5-51093519d329","Type":"ContainerDied","Data":"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.150334 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" event={"ID":"571e8f9f-9662-4139-9cf5-51093519d329","Type":"ContainerDied","Data":"5ebc857c1bbf55c9940f46beb8849aa4a4e146d79dfab92168ef0551ebd5d716"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.150355 4658 scope.go:117] "RemoveContainer" containerID="d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.150485 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q5bt5" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.153702 4658 generic.go:334] "Generic (PLEG): container finished" podID="ed79e082-6f4d-418d-bf20-621fb495976a" containerID="d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c" exitCode=0 Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.153745 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerDied","Data":"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.153767 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzshr" event={"ID":"ed79e082-6f4d-418d-bf20-621fb495976a","Type":"ContainerDied","Data":"b851029894a8f17d7b4e1a125750c5e12948e8e4322790dd257a6ce8c039e29a"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.153832 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzshr" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180411 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca\") pod \"571e8f9f-9662-4139-9cf5-51093519d329\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180460 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wllx\" (UniqueName: \"kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx\") pod \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180494 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content\") pod \"45634610-7bec-413b-8b11-3b90a851b37b\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180525 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities\") pod \"ed79e082-6f4d-418d-bf20-621fb495976a\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180575 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content\") pod \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180607 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mb44\" (UniqueName: \"kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44\") pod \"45634610-7bec-413b-8b11-3b90a851b37b\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180641 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7xx4\" (UniqueName: 
\"kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4\") pod \"ed79e082-6f4d-418d-bf20-621fb495976a\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180685 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics\") pod \"571e8f9f-9662-4139-9cf5-51093519d329\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180710 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66zsq\" (UniqueName: \"kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq\") pod \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180734 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content\") pod \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180771 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities\") pod \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\" (UID: \"578b83fe-55ef-4dc7-8df1-d1e2fce37db8\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180794 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdpss\" (UniqueName: \"kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss\") pod \"571e8f9f-9662-4139-9cf5-51093519d329\" (UID: \"571e8f9f-9662-4139-9cf5-51093519d329\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180815 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content\") pod \"ed79e082-6f4d-418d-bf20-621fb495976a\" (UID: \"ed79e082-6f4d-418d-bf20-621fb495976a\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180864 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities\") pod \"45634610-7bec-413b-8b11-3b90a851b37b\" (UID: \"45634610-7bec-413b-8b11-3b90a851b37b\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.180885 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities\") pod \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\" (UID: \"cc5605be-988f-43bc-b3e1-4d7346ef81cf\") " Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.181197 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "571e8f9f-9662-4139-9cf5-51093519d329" (UID: "571e8f9f-9662-4139-9cf5-51093519d329"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.182657 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities" (OuterVolumeSpecName: "utilities") pod "578b83fe-55ef-4dc7-8df1-d1e2fce37db8" (UID: "578b83fe-55ef-4dc7-8df1-d1e2fce37db8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.183074 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities" (OuterVolumeSpecName: "utilities") pod "cc5605be-988f-43bc-b3e1-4d7346ef81cf" (UID: "cc5605be-988f-43bc-b3e1-4d7346ef81cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.185267 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities" (OuterVolumeSpecName: "utilities") pod "45634610-7bec-413b-8b11-3b90a851b37b" (UID: "45634610-7bec-413b-8b11-3b90a851b37b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.186518 4658 generic.go:334] "Generic (PLEG): container finished" podID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerID="ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e" exitCode=0 Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.186676 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wchxv" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.186967 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities" (OuterVolumeSpecName: "utilities") pod "ed79e082-6f4d-418d-bf20-621fb495976a" (UID: "ed79e082-6f4d-418d-bf20-621fb495976a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.186982 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerDied","Data":"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.187015 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wchxv" event={"ID":"578b83fe-55ef-4dc7-8df1-d1e2fce37db8","Type":"ContainerDied","Data":"03f90ea38b8cc9b6ff583652490b27c345baf08c30c46c21a8d1b6bfa1a73ae9"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.187780 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "571e8f9f-9662-4139-9cf5-51093519d329" (UID: "571e8f9f-9662-4139-9cf5-51093519d329"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.188962 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44" (OuterVolumeSpecName: "kube-api-access-2mb44") pod "45634610-7bec-413b-8b11-3b90a851b37b" (UID: "45634610-7bec-413b-8b11-3b90a851b37b"). InnerVolumeSpecName "kube-api-access-2mb44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.189634 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq" (OuterVolumeSpecName: "kube-api-access-66zsq") pod "578b83fe-55ef-4dc7-8df1-d1e2fce37db8" (UID: "578b83fe-55ef-4dc7-8df1-d1e2fce37db8"). InnerVolumeSpecName "kube-api-access-66zsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.193533 4658 generic.go:334] "Generic (PLEG): container finished" podID="45634610-7bec-413b-8b11-3b90a851b37b" containerID="14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965" exitCode=0 Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.193654 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerDied","Data":"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.193703 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjkgd" event={"ID":"45634610-7bec-413b-8b11-3b90a851b37b","Type":"ContainerDied","Data":"79d41d4f7abd2b7392336092511933b6dec3ec6ab9920e0436134b7c037bae3c"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.193956 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjkgd" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.195758 4658 scope.go:117] "RemoveContainer" containerID="d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.196224 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859\": container with ID starting with d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859 not found: ID does not exist" containerID="d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.196289 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859"} err="failed to get container status \"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859\": rpc error: code = NotFound desc = could not find container \"d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859\": container with ID starting with d3a4f92a6424e91a7c054b12b3d5a79cf7d61f15090c02e942e999fb23bac859 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.196343 4658 scope.go:117] "RemoveContainer" containerID="d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.196448 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4" (OuterVolumeSpecName: "kube-api-access-z7xx4") pod "ed79e082-6f4d-418d-bf20-621fb495976a" (UID: "ed79e082-6f4d-418d-bf20-621fb495976a"). InnerVolumeSpecName "kube-api-access-z7xx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.197274 4658 generic.go:334] "Generic (PLEG): container finished" podID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerID="922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c" exitCode=0 Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.197356 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerDied","Data":"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.197390 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzf5c" event={"ID":"cc5605be-988f-43bc-b3e1-4d7346ef81cf","Type":"ContainerDied","Data":"ce9df9ce731de7ee46008067c08351ae97e88667035504a5c060aafbaaa3fa84"} Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.197475 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzf5c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.198849 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss" (OuterVolumeSpecName: "kube-api-access-kdpss") pod "571e8f9f-9662-4139-9cf5-51093519d329" (UID: "571e8f9f-9662-4139-9cf5-51093519d329"). InnerVolumeSpecName "kube-api-access-kdpss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.199079 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45634610-7bec-413b-8b11-3b90a851b37b" (UID: "45634610-7bec-413b-8b11-3b90a851b37b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.205865 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx" (OuterVolumeSpecName: "kube-api-access-8wllx") pod "cc5605be-988f-43bc-b3e1-4d7346ef81cf" (UID: "cc5605be-988f-43bc-b3e1-4d7346ef81cf"). InnerVolumeSpecName "kube-api-access-8wllx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.226025 4658 scope.go:117] "RemoveContainer" containerID="9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.244814 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "578b83fe-55ef-4dc7-8df1-d1e2fce37db8" (UID: "578b83fe-55ef-4dc7-8df1-d1e2fce37db8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.255998 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clrx4"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.262865 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed79e082-6f4d-418d-bf20-621fb495976a" (UID: "ed79e082-6f4d-418d-bf20-621fb495976a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.264560 4658 scope.go:117] "RemoveContainer" containerID="1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741" Oct 02 11:23:28 crc kubenswrapper[4658]: W1002 11:23:28.267869 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdd5f96_dd0d_4f77_8e41_83a8493dbca7.slice/crio-0145dac9328f730eb914e0738c1e405a4f18dad993e0e9ff1b3bffc7fc64af8d WatchSource:0}: Error finding container 0145dac9328f730eb914e0738c1e405a4f18dad993e0e9ff1b3bffc7fc64af8d: Status 404 returned error can't find the container with id 0145dac9328f730eb914e0738c1e405a4f18dad993e0e9ff1b3bffc7fc64af8d Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.278957 4658 scope.go:117] "RemoveContainer" containerID="d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.279475 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c\": container with ID starting with d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c not found: ID does not exist" containerID="d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279505 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c"} err="failed to get container status \"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c\": rpc error: code = NotFound desc = could not find container \"d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c\": container with ID starting with d47dceffe71f7b5988500e8d92d725fe6b000d56c152261b96cc4dfbff90f75c not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279528 4658 scope.go:117] "RemoveContainer" containerID="9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.279759 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf\": container with ID starting with 9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf not found: ID does not exist" containerID="9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279776 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf"} err="failed to get container status \"9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf\": rpc error: code = NotFound desc = could not find container \"9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf\": container with ID starting with 9028d94a6af94f85ceafa7dfa5ae6e3941357aff7ca9e7cc5b965c2478c609cf not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279787 4658 scope.go:117] "RemoveContainer" containerID="1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.279968 4658 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741\": container with ID starting with 1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741 not found: ID does not exist" containerID="1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279980 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741"} err="failed to get container status \"1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741\": rpc error: code = NotFound desc = could not find container \"1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741\": container with ID starting with 1a8adee04125a9a6e1f844e2372c902636f4f04d6c0237c4ba1d7fee157b5741 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.279991 4658 scope.go:117] "RemoveContainer" containerID="ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283097 4658 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283196 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66zsq\" (UniqueName: \"kubernetes.io/projected/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-kube-api-access-66zsq\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283227 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283270 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/578b83fe-55ef-4dc7-8df1-d1e2fce37db8-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283284 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdpss\" (UniqueName: \"kubernetes.io/projected/571e8f9f-9662-4139-9cf5-51093519d329-kube-api-access-kdpss\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283552 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283574 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283645 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283873 4658 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/571e8f9f-9662-4139-9cf5-51093519d329-marketplace-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283890 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wllx\" (UniqueName: \"kubernetes.io/projected/cc5605be-988f-43bc-b3e1-4d7346ef81cf-kube-api-access-8wllx\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283985 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45634610-7bec-413b-8b11-3b90a851b37b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.283999 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed79e082-6f4d-418d-bf20-621fb495976a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.284010 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mb44\" (UniqueName: \"kubernetes.io/projected/45634610-7bec-413b-8b11-3b90a851b37b-kube-api-access-2mb44\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.284090 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7xx4\" (UniqueName: \"kubernetes.io/projected/ed79e082-6f4d-418d-bf20-621fb495976a-kube-api-access-z7xx4\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.295724 4658 scope.go:117] "RemoveContainer" containerID="c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.300195 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc5605be-988f-43bc-b3e1-4d7346ef81cf" (UID: "cc5605be-988f-43bc-b3e1-4d7346ef81cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.309748 4658 scope.go:117] "RemoveContainer" containerID="aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.327197 4658 scope.go:117] "RemoveContainer" containerID="ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.327591 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e\": container with ID starting with ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e not found: ID does not exist" containerID="ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.327642 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e"} err="failed to get container status \"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e\": rpc error: code = NotFound desc = could not find container \"ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e\": container with ID starting with ec55a619e50c3c373d3f2de6b7cb40666b39844e3e9a25af0c28a59004e4f91e not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.327673 4658 scope.go:117] "RemoveContainer" containerID="c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.327968 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b\": container with ID starting with c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b not found: ID does not exist" containerID="c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.327997 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b"} err="failed to get container status \"c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b\": rpc error: code = NotFound desc = could not find container \"c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b\": container with ID starting with c3c29c92914def2b1c409bb6521e0c67198f8892ec26d9a194b421e10160ab0b not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.328018 4658 scope.go:117] "RemoveContainer" containerID="aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.328211 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc\": container with ID starting with aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc not found: ID does not exist" containerID="aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.328232 4658 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc"} err="failed to get container status \"aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc\": rpc error: code = NotFound desc = could not find container \"aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc\": container with ID starting with aa84e077eeac8b7827f53e91a5f535a9acf78a06bc3f3b3abd9e2f8c8fa247fc not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.328245 4658 scope.go:117] "RemoveContainer" containerID="14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.344354 4658 scope.go:117] "RemoveContainer" containerID="a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.361576 4658 scope.go:117] "RemoveContainer" containerID="bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.378061 4658 scope.go:117] "RemoveContainer" containerID="14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.380797 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965\": container with ID starting with 14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965 not found: ID does not exist" containerID="14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.380848 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965"} err="failed to get container status \"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965\": rpc error: code = NotFound desc = could not find container \"14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965\": container with ID starting with 14c8a6af587b1e0c202976eb88243d4b2ff5db4739d7dc4f40eb0dae91790965 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.380880 4658 scope.go:117] "RemoveContainer" containerID="a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.381374 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200\": container with ID starting with a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200 not found: ID does not exist" containerID="a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.381409 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200"} err="failed to get container status \"a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200\": rpc error: code = NotFound desc = could not find container \"a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200\": container with ID starting with a5a24e85c6bb39e45ab3932272c1654a3cc1a88b7daefce04fc3e617f3339200 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.381437 4658 
scope.go:117] "RemoveContainer" containerID="bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.381797 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135\": container with ID starting with bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135 not found: ID does not exist" containerID="bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.381826 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135"} err="failed to get container status \"bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135\": rpc error: code = NotFound desc = could not find container \"bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135\": container with ID starting with bbff5daa5378c068ff20adccbb779407ac790fe3788f93bdfa1fef4155073135 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.381845 4658 scope.go:117] "RemoveContainer" containerID="922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.384864 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc5605be-988f-43bc-b3e1-4d7346ef81cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.404134 4658 scope.go:117] "RemoveContainer" containerID="3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.418306 4658 scope.go:117] "RemoveContainer" containerID="f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.431003 4658 scope.go:117] "RemoveContainer" containerID="922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.431352 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c\": container with ID starting with 922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c not found: ID does not exist" containerID="922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.431407 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c"} err="failed to get container status \"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c\": rpc error: code = NotFound desc = could not find container \"922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c\": container with ID starting with 922ea88680571ad0d0a630029cb77f8b4910ed3ad74f130ba1c12a583a95ca9c not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.431448 4658 scope.go:117] "RemoveContainer" containerID="3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.431863 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c\": container with ID starting with 3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c not found: ID does not exist" containerID="3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.431891 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c"} err="failed to get container status \"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c\": rpc error: code = NotFound desc = could not find container \"3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c\": container with ID starting with 3f34b4e8dcde208e803229327464472fe432850b50343a9f0d66812d2376ff0c not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.431911 4658 scope.go:117] "RemoveContainer" containerID="f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7" Oct 02 11:23:28 crc kubenswrapper[4658]: E1002 11:23:28.432177 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7\": container with ID starting with f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7 not found: ID does not exist" containerID="f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.432206 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7"} err="failed to get container status \"f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7\": rpc error: code = NotFound desc = could not find container \"f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7\": container with ID starting with f1b7bac6fa01a2ee1a5643f17a4b34f6b20bbc1239fd86baa9e77e516b0f84b7 not found: ID does not exist" Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.479652 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.485040 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q5bt5"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.488324 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.490397 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qzshr"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.517184 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.526810 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wchxv"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.530416 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.531747 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzf5c"] Oct 02 11:23:28 crc 
kubenswrapper[4658]: I1002 11:23:28.539784 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:23:28 crc kubenswrapper[4658]: I1002 11:23:28.542098 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjkgd"] Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.205845 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" event={"ID":"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7","Type":"ContainerStarted","Data":"0a83ab110c06944bb8f280df1fdc689df9b9b3c1dfd46b9b6186532cc1d74d70"} Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.206893 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.206976 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" event={"ID":"0cdd5f96-dd0d-4f77-8e41-83a8493dbca7","Type":"ContainerStarted","Data":"0145dac9328f730eb914e0738c1e405a4f18dad993e0e9ff1b3bffc7fc64af8d"} Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.210989 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.225816 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-clrx4" podStartSLOduration=2.225785583 podStartE2EDuration="2.225785583s" podCreationTimestamp="2025-10-02 11:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:23:29.222603035 +0000 UTC m=+290.113756602" watchObservedRunningTime="2025-10-02 11:23:29.225785583 +0000 UTC m=+290.116939170" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.791915 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7v4fx"] Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792411 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792423 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792433 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792439 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792447 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792454 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792463 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="extract-content" Oct 
02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792469 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792476 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792482 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792492 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792498 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792506 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792512 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792521 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792527 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792533 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792539 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792545 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792550 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792558 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792564 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792571 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792576 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="extract-content" Oct 02 11:23:29 crc kubenswrapper[4658]: E1002 11:23:29.792582 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" 
containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792588 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="extract-utilities" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792668 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792680 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792693 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="571e8f9f-9662-4139-9cf5-51093519d329" containerName="marketplace-operator" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792701 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.792708 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="45634610-7bec-413b-8b11-3b90a851b37b" containerName="registry-server" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.793378 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.796274 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.800883 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbcq\" (UniqueName: \"kubernetes.io/projected/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-kube-api-access-rjbcq\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.800944 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-utilities\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.800976 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-catalog-content\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.804574 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7v4fx"] Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.902572 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbcq\" (UniqueName: \"kubernetes.io/projected/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-kube-api-access-rjbcq\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.902666 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-utilities\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.902878 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-catalog-content\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.903417 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-catalog-content\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.903672 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-utilities\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.919863 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbcq\" (UniqueName: \"kubernetes.io/projected/1cab9c15-8dc5-46cf-bb34-84ea996f0cc6-kube-api-access-rjbcq\") pod \"redhat-marketplace-7v4fx\" (UID: \"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6\") " pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.956473 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45634610-7bec-413b-8b11-3b90a851b37b" path="/var/lib/kubelet/pods/45634610-7bec-413b-8b11-3b90a851b37b/volumes" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.957911 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571e8f9f-9662-4139-9cf5-51093519d329" path="/var/lib/kubelet/pods/571e8f9f-9662-4139-9cf5-51093519d329/volumes" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.960459 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578b83fe-55ef-4dc7-8df1-d1e2fce37db8" path="/var/lib/kubelet/pods/578b83fe-55ef-4dc7-8df1-d1e2fce37db8/volumes" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.962506 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5605be-988f-43bc-b3e1-4d7346ef81cf" path="/var/lib/kubelet/pods/cc5605be-988f-43bc-b3e1-4d7346ef81cf/volumes" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.963715 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed79e082-6f4d-418d-bf20-621fb495976a" path="/var/lib/kubelet/pods/ed79e082-6f4d-418d-bf20-621fb495976a/volumes" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.989617 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-czdjc"] Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.990475 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:29 crc kubenswrapper[4658]: I1002 11:23:29.995406 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.003529 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-utilities\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.003577 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-catalog-content\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.003652 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5f2\" (UniqueName: \"kubernetes.io/projected/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-kube-api-access-fl5f2\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.004316 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czdjc"] Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.104693 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-catalog-content\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.104814 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5f2\" (UniqueName: \"kubernetes.io/projected/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-kube-api-access-fl5f2\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.104861 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-utilities\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.105188 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-catalog-content\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.105340 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-utilities\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " 
pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.126718 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.131315 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5f2\" (UniqueName: \"kubernetes.io/projected/4cf74ad0-2d22-4e96-a77f-0df6ee38dfde-kube-api-access-fl5f2\") pod \"redhat-operators-czdjc\" (UID: \"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde\") " pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.322208 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.326500 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7v4fx"] Oct 02 11:23:30 crc kubenswrapper[4658]: W1002 11:23:30.342659 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cab9c15_8dc5_46cf_bb34_84ea996f0cc6.slice/crio-a4bc68829fdcca652b428fdb4100affeeb86c4e8e74f9772cb9ca85010ad5dea WatchSource:0}: Error finding container a4bc68829fdcca652b428fdb4100affeeb86c4e8e74f9772cb9ca85010ad5dea: Status 404 returned error can't find the container with id a4bc68829fdcca652b428fdb4100affeeb86c4e8e74f9772cb9ca85010ad5dea Oct 02 11:23:30 crc kubenswrapper[4658]: I1002 11:23:30.526346 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czdjc"] Oct 02 11:23:30 crc kubenswrapper[4658]: W1002 11:23:30.574198 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf74ad0_2d22_4e96_a77f_0df6ee38dfde.slice/crio-09063e19c54a4059f6202f7ae101285b692f70d26c93e1f077cbad97f6c75f67 WatchSource:0}: Error finding container 09063e19c54a4059f6202f7ae101285b692f70d26c93e1f077cbad97f6c75f67: Status 404 returned error can't find the container with id 09063e19c54a4059f6202f7ae101285b692f70d26c93e1f077cbad97f6c75f67 Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.227904 4658 generic.go:334] "Generic (PLEG): container finished" podID="1cab9c15-8dc5-46cf-bb34-84ea996f0cc6" containerID="a2517746deef5738f009e62a509995a883cc8f1794d4d81e5653a95295296555" exitCode=0 Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.228000 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7v4fx" event={"ID":"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6","Type":"ContainerDied","Data":"a2517746deef5738f009e62a509995a883cc8f1794d4d81e5653a95295296555"} Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.228040 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7v4fx" event={"ID":"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6","Type":"ContainerStarted","Data":"a4bc68829fdcca652b428fdb4100affeeb86c4e8e74f9772cb9ca85010ad5dea"} Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.229655 4658 generic.go:334] "Generic (PLEG): container finished" podID="4cf74ad0-2d22-4e96-a77f-0df6ee38dfde" containerID="24906d211e391b718d450d322c8cc2b3ebfe12bcb64acf9c29f420e8f6d0cce2" exitCode=0 Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.230250 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-czdjc" event={"ID":"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde","Type":"ContainerDied","Data":"24906d211e391b718d450d322c8cc2b3ebfe12bcb64acf9c29f420e8f6d0cce2"} Oct 02 11:23:31 crc kubenswrapper[4658]: I1002 11:23:31.230326 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czdjc" event={"ID":"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde","Type":"ContainerStarted","Data":"09063e19c54a4059f6202f7ae101285b692f70d26c93e1f077cbad97f6c75f67"} Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.198534 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"] Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.200758 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.203054 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.209176 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"] Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.229250 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2sj4\" (UniqueName: \"kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.229319 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.229404 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.330456 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2sj4\" (UniqueName: \"kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.330521 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.330545 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities\") pod 
\"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.331175 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.332946 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.351187 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2sj4\" (UniqueName: \"kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4\") pod \"certified-operators-2bgnl\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") " pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.402133 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7kxq7"] Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.403321 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.406567 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.408184 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kxq7"] Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.431087 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-catalog-content\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.431207 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-utilities\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.431268 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8g6l\" (UniqueName: \"kubernetes.io/projected/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-kube-api-access-k8g6l\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.525851 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.532473 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-catalog-content\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.532517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-utilities\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.532557 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8g6l\" (UniqueName: \"kubernetes.io/projected/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-kube-api-access-k8g6l\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.533252 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-catalog-content\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.533532 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-utilities\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.549688 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8g6l\" (UniqueName: \"kubernetes.io/projected/8b9f70c5-a35e-43e4-9b22-41a924ab19f3-kube-api-access-k8g6l\") pod \"community-operators-7kxq7\" (UID: \"8b9f70c5-a35e-43e4-9b22-41a924ab19f3\") " pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.745250 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"] Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.752149 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:32 crc kubenswrapper[4658]: I1002 11:23:32.939758 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kxq7"] Oct 02 11:23:33 crc kubenswrapper[4658]: E1002 11:23:33.043881 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf74ad0_2d22_4e96_a77f_0df6ee38dfde.slice/crio-conmon-41210054316f9226d34fb45503320318e56266e9ec1d6a347168327580ff075a.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:23:33 crc kubenswrapper[4658]: W1002 11:23:33.067729 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9f70c5_a35e_43e4_9b22_41a924ab19f3.slice/crio-ee45c163e4461da246e47da5ed3e3db911aee65963da757d3108ff46efcade2e WatchSource:0}: Error finding container ee45c163e4461da246e47da5ed3e3db911aee65963da757d3108ff46efcade2e: Status 404 returned error can't find the container with id ee45c163e4461da246e47da5ed3e3db911aee65963da757d3108ff46efcade2e Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.241783 4658 generic.go:334] "Generic (PLEG): container finished" podID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerID="4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f" exitCode=0 Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.241865 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerDied","Data":"4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f"} Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.242260 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerStarted","Data":"a6a1449a3faf9a4547b55d926e26c947e9845f849e7ec2a6a50d0113c6bfe7ae"} Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.248041 4658 generic.go:334] "Generic (PLEG): container finished" podID="4cf74ad0-2d22-4e96-a77f-0df6ee38dfde" containerID="41210054316f9226d34fb45503320318e56266e9ec1d6a347168327580ff075a" exitCode=0 Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.248110 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czdjc" event={"ID":"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde","Type":"ContainerDied","Data":"41210054316f9226d34fb45503320318e56266e9ec1d6a347168327580ff075a"} Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.253390 4658 generic.go:334] "Generic (PLEG): container finished" podID="1cab9c15-8dc5-46cf-bb34-84ea996f0cc6" containerID="50d550e8812468ef518defab69103d54ba17510fb499e49b8e9f0a498291e033" exitCode=0 Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.255010 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7v4fx" event={"ID":"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6","Type":"ContainerDied","Data":"50d550e8812468ef518defab69103d54ba17510fb499e49b8e9f0a498291e033"} Oct 02 11:23:33 crc kubenswrapper[4658]: I1002 11:23:33.257943 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kxq7" 
event={"ID":"8b9f70c5-a35e-43e4-9b22-41a924ab19f3","Type":"ContainerStarted","Data":"ee45c163e4461da246e47da5ed3e3db911aee65963da757d3108ff46efcade2e"} Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.266329 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7v4fx" event={"ID":"1cab9c15-8dc5-46cf-bb34-84ea996f0cc6","Type":"ContainerStarted","Data":"2ee25b3c50e6820e84e417c6fe2c68eeb0cc6cbb1f51eaf687b0f73d99733836"} Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.268817 4658 generic.go:334] "Generic (PLEG): container finished" podID="8b9f70c5-a35e-43e4-9b22-41a924ab19f3" containerID="5d5fd367f7d15721d0022010969ad4f96bc63bfa827ee1e1c63a2c13c0a12547" exitCode=0 Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.268904 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kxq7" event={"ID":"8b9f70c5-a35e-43e4-9b22-41a924ab19f3","Type":"ContainerDied","Data":"5d5fd367f7d15721d0022010969ad4f96bc63bfa827ee1e1c63a2c13c0a12547"} Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.271792 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czdjc" event={"ID":"4cf74ad0-2d22-4e96-a77f-0df6ee38dfde","Type":"ContainerStarted","Data":"ecd5be8ad173113090607bf91a3d91167c3fe84b1805c5e73a502bb531b88fcd"} Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.286145 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7v4fx" podStartSLOduration=2.622500449 podStartE2EDuration="5.286129999s" podCreationTimestamp="2025-10-02 11:23:29 +0000 UTC" firstStartedPulling="2025-10-02 11:23:31.22897728 +0000 UTC m=+292.120130847" lastFinishedPulling="2025-10-02 11:23:33.89260683 +0000 UTC m=+294.783760397" observedRunningTime="2025-10-02 11:23:34.281395697 +0000 UTC m=+295.172549264" watchObservedRunningTime="2025-10-02 11:23:34.286129999 +0000 UTC m=+295.177283566" Oct 02 11:23:34 crc kubenswrapper[4658]: I1002 11:23:34.317482 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-czdjc" podStartSLOduration=2.4413254970000002 podStartE2EDuration="5.317467751s" podCreationTimestamp="2025-10-02 11:23:29 +0000 UTC" firstStartedPulling="2025-10-02 11:23:31.231090981 +0000 UTC m=+292.122244548" lastFinishedPulling="2025-10-02 11:23:34.107233235 +0000 UTC m=+294.998386802" observedRunningTime="2025-10-02 11:23:34.315932458 +0000 UTC m=+295.207086025" watchObservedRunningTime="2025-10-02 11:23:34.317467751 +0000 UTC m=+295.208621318" Oct 02 11:23:35 crc kubenswrapper[4658]: I1002 11:23:35.277221 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kxq7" event={"ID":"8b9f70c5-a35e-43e4-9b22-41a924ab19f3","Type":"ContainerStarted","Data":"315807de5ab9fde9d4b474aae17143793a8afde7d3bc8247b6e3c656ad6ebed4"} Oct 02 11:23:35 crc kubenswrapper[4658]: I1002 11:23:35.279347 4658 generic.go:334] "Generic (PLEG): container finished" podID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerID="ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690" exitCode=0 Oct 02 11:23:35 crc kubenswrapper[4658]: I1002 11:23:35.279595 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerDied","Data":"ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690"} Oct 02 11:23:36 
crc kubenswrapper[4658]: I1002 11:23:36.286046 4658 generic.go:334] "Generic (PLEG): container finished" podID="8b9f70c5-a35e-43e4-9b22-41a924ab19f3" containerID="315807de5ab9fde9d4b474aae17143793a8afde7d3bc8247b6e3c656ad6ebed4" exitCode=0 Oct 02 11:23:36 crc kubenswrapper[4658]: I1002 11:23:36.286688 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kxq7" event={"ID":"8b9f70c5-a35e-43e4-9b22-41a924ab19f3","Type":"ContainerDied","Data":"315807de5ab9fde9d4b474aae17143793a8afde7d3bc8247b6e3c656ad6ebed4"} Oct 02 11:23:37 crc kubenswrapper[4658]: I1002 11:23:37.293923 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kxq7" event={"ID":"8b9f70c5-a35e-43e4-9b22-41a924ab19f3","Type":"ContainerStarted","Data":"f4444f74a13d60a37d9905d0714c6c0549a414b9e9b6058010d8f95917c198cb"} Oct 02 11:23:37 crc kubenswrapper[4658]: I1002 11:23:37.296245 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerStarted","Data":"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"} Oct 02 11:23:37 crc kubenswrapper[4658]: I1002 11:23:37.309879 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7kxq7" podStartSLOduration=2.762315178 podStartE2EDuration="5.309867424s" podCreationTimestamp="2025-10-02 11:23:32 +0000 UTC" firstStartedPulling="2025-10-02 11:23:34.270426426 +0000 UTC m=+295.161579993" lastFinishedPulling="2025-10-02 11:23:36.817978672 +0000 UTC m=+297.709132239" observedRunningTime="2025-10-02 11:23:37.308690304 +0000 UTC m=+298.199843871" watchObservedRunningTime="2025-10-02 11:23:37.309867424 +0000 UTC m=+298.201020991" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.127900 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.128257 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.200219 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.226747 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2bgnl" podStartSLOduration=5.730696339 podStartE2EDuration="8.226723638s" podCreationTimestamp="2025-10-02 11:23:32 +0000 UTC" firstStartedPulling="2025-10-02 11:23:33.246810622 +0000 UTC m=+294.137964229" lastFinishedPulling="2025-10-02 11:23:35.742837961 +0000 UTC m=+296.633991528" observedRunningTime="2025-10-02 11:23:37.328279028 +0000 UTC m=+298.219432595" watchObservedRunningTime="2025-10-02 11:23:40.226723638 +0000 UTC m=+301.117877235" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.325696 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.325886 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.367850 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-7v4fx" Oct 02 11:23:40 crc kubenswrapper[4658]: I1002 11:23:40.368640 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:41 crc kubenswrapper[4658]: I1002 11:23:41.178043 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:23:41 crc kubenswrapper[4658]: I1002 11:23:41.357846 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-czdjc" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.526697 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.526753 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.563011 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.753165 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.753532 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:42 crc kubenswrapper[4658]: I1002 11:23:42.788876 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:23:43 crc kubenswrapper[4658]: I1002 11:23:43.377545 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2bgnl" Oct 02 11:23:43 crc kubenswrapper[4658]: I1002 11:23:43.409608 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7kxq7" Oct 02 11:24:27 crc kubenswrapper[4658]: I1002 11:24:27.430375 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:24:27 crc kubenswrapper[4658]: I1002 11:24:27.431582 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:24:57 crc kubenswrapper[4658]: I1002 11:24:57.429606 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:24:57 crc kubenswrapper[4658]: I1002 11:24:57.430108 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.429573 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.430137 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.430200 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.430682 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.430741 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7" gracePeriod=600 Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.957130 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7" exitCode=0 Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.957203 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7"} Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.957560 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278"} Oct 02 11:25:27 crc kubenswrapper[4658]: I1002 11:25:27.957592 4658 scope.go:117] "RemoveContainer" containerID="058b1729c4fd0dfeb6c1960aa473b9ace64d78f25d003848fc4980ef1a9971b7" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.111411 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2c9f2"] Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.112580 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.128376 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2c9f2"] Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279596 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-trusted-ca\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279658 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279697 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5jr\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-kube-api-access-wp5jr\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279768 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279794 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-certificates\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279905 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.279984 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-tls\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.280048 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-bound-sa-token\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.298708 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381043 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-trusted-ca\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381093 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381118 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5jr\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-kube-api-access-wp5jr\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381158 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381176 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-certificates\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381209 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-tls\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.381230 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-bound-sa-token\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.382637 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.383080 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-trusted-ca\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.383683 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-certificates\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.388419 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.390146 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-registry-tls\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.406492 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5jr\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-kube-api-access-wp5jr\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.408059 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9c19db7-12ab-43e3-b6bf-eadd9fa5f267-bound-sa-token\") pod \"image-registry-66df7c8f76-2c9f2\" (UID: \"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267\") " pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.429847 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:39 crc kubenswrapper[4658]: I1002 11:25:39.614115 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2c9f2"] Oct 02 11:25:39 crc kubenswrapper[4658]: W1002 11:25:39.619229 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c19db7_12ab_43e3_b6bf_eadd9fa5f267.slice/crio-30e384394b9ef7f9755fff4129e546298bd15bc39478cc1635e5814dcc25dda0 WatchSource:0}: Error finding container 30e384394b9ef7f9755fff4129e546298bd15bc39478cc1635e5814dcc25dda0: Status 404 returned error can't find the container with id 30e384394b9ef7f9755fff4129e546298bd15bc39478cc1635e5814dcc25dda0 Oct 02 11:25:40 crc kubenswrapper[4658]: I1002 11:25:40.025185 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" event={"ID":"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267","Type":"ContainerStarted","Data":"4ce07af5b5641b02dbd5c5a6a6be57c177f49fdf144eec184daad9d84feb4260"} Oct 02 11:25:40 crc kubenswrapper[4658]: I1002 11:25:40.025235 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" event={"ID":"c9c19db7-12ab-43e3-b6bf-eadd9fa5f267","Type":"ContainerStarted","Data":"30e384394b9ef7f9755fff4129e546298bd15bc39478cc1635e5814dcc25dda0"} Oct 02 11:25:40 crc kubenswrapper[4658]: I1002 11:25:40.047001 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" podStartSLOduration=1.046983192 podStartE2EDuration="1.046983192s" podCreationTimestamp="2025-10-02 11:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:25:40.046842518 +0000 UTC m=+420.937996125" watchObservedRunningTime="2025-10-02 11:25:40.046983192 +0000 UTC m=+420.938136749" Oct 02 11:25:41 crc kubenswrapper[4658]: I1002 11:25:41.031240 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:59 crc kubenswrapper[4658]: I1002 11:25:59.435971 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2c9f2" Oct 02 11:25:59 crc kubenswrapper[4658]: I1002 11:25:59.494569 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:26:24 crc kubenswrapper[4658]: I1002 11:26:24.540589 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" podUID="8b99dd62-8d35-4423-a53a-da7654a17fb7" containerName="registry" containerID="cri-o://420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651" gracePeriod=30 Oct 02 11:26:24 crc kubenswrapper[4658]: I1002 11:26:24.931812 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.098767 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099103 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099178 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099258 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099349 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099403 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldzb\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099475 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.099512 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets\") pod \"8b99dd62-8d35-4423-a53a-da7654a17fb7\" (UID: \"8b99dd62-8d35-4423-a53a-da7654a17fb7\") " Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.100231 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.100584 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.104580 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb" (OuterVolumeSpecName: "kube-api-access-6ldzb") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "kube-api-access-6ldzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.104608 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.104857 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.106878 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.109818 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.120222 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8b99dd62-8d35-4423-a53a-da7654a17fb7" (UID: "8b99dd62-8d35-4423-a53a-da7654a17fb7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201570 4658 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b99dd62-8d35-4423-a53a-da7654a17fb7-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201674 4658 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b99dd62-8d35-4423-a53a-da7654a17fb7-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201715 4658 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201726 4658 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201741 4658 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201755 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ldzb\" (UniqueName: \"kubernetes.io/projected/8b99dd62-8d35-4423-a53a-da7654a17fb7-kube-api-access-6ldzb\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.201790 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b99dd62-8d35-4423-a53a-da7654a17fb7-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.282738 4658 generic.go:334] "Generic (PLEG): container finished" podID="8b99dd62-8d35-4423-a53a-da7654a17fb7" containerID="420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651" exitCode=0 Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.282777 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" event={"ID":"8b99dd62-8d35-4423-a53a-da7654a17fb7","Type":"ContainerDied","Data":"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651"} Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.282805 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" event={"ID":"8b99dd62-8d35-4423-a53a-da7654a17fb7","Type":"ContainerDied","Data":"bec11d2ac06f6f43e3828de35ec55e1330d6571e922379f90e1da1b3a83c5c37"} Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.282821 4658 scope.go:117] "RemoveContainer" containerID="420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.282845 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-swmbd" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.303789 4658 scope.go:117] "RemoveContainer" containerID="420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651" Oct 02 11:26:25 crc kubenswrapper[4658]: E1002 11:26:25.304411 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651\": container with ID starting with 420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651 not found: ID does not exist" containerID="420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.304439 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651"} err="failed to get container status \"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651\": rpc error: code = NotFound desc = could not find container \"420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651\": container with ID starting with 420a8ab49c9613163b2edd0de03cfbfaa6b61b176f4d9d61bd8267ec6da99651 not found: ID does not exist" Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.316543 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.320037 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-swmbd"] Oct 02 11:26:25 crc kubenswrapper[4658]: I1002 11:26:25.961815 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b99dd62-8d35-4423-a53a-da7654a17fb7" path="/var/lib/kubelet/pods/8b99dd62-8d35-4423-a53a-da7654a17fb7/volumes" Oct 02 11:27:27 crc kubenswrapper[4658]: I1002 11:27:27.430028 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:27:27 crc kubenswrapper[4658]: I1002 11:27:27.430561 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:27:57 crc kubenswrapper[4658]: I1002 11:27:57.430418 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:27:57 crc kubenswrapper[4658]: I1002 11:27:57.431067 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.429733 4658 patch_prober.go:28] interesting 
pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.431366 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.431485 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.432105 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.432244 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278" gracePeriod=600 Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.978891 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278" exitCode=0 Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.978959 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278"} Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.979335 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d"} Oct 02 11:28:27 crc kubenswrapper[4658]: I1002 11:28:27.979388 4658 scope.go:117] "RemoveContainer" containerID="6e1ee24640cae00955a1eb1c09dda3a8adfd0722fb6bd8f5d27b0d22a6570dc7" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.189345 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4jlqn"] Oct 02 11:28:57 crc kubenswrapper[4658]: E1002 11:28:57.190202 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b99dd62-8d35-4423-a53a-da7654a17fb7" containerName="registry" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.190251 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b99dd62-8d35-4423-a53a-da7654a17fb7" containerName="registry" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.190390 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b99dd62-8d35-4423-a53a-da7654a17fb7" containerName="registry" Oct 02 
11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.190856 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.192808 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn57q"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.193617 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cn57q" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.194369 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.194615 4658 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2vwtr" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.194771 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.200372 4658 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zpn26" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.205883 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn57q"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.212053 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4jlqn"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.216350 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-88pc4"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.217221 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.220948 4658 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-q4xjw" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.238227 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-88pc4"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.245041 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxg8w\" (UniqueName: \"kubernetes.io/projected/648c22f9-bc82-4a6a-9b68-b9b557f0c243-kube-api-access-gxg8w\") pod \"cert-manager-webhook-5655c58dd6-88pc4\" (UID: \"648c22f9-bc82-4a6a-9b68-b9b557f0c243\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.245100 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd78n\" (UniqueName: \"kubernetes.io/projected/329487df-e7b0-4925-8c85-155c96453929-kube-api-access-vd78n\") pod \"cert-manager-cainjector-7f985d654d-4jlqn\" (UID: \"329487df-e7b0-4925-8c85-155c96453929\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.245130 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt8sq\" (UniqueName: \"kubernetes.io/projected/2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf-kube-api-access-dt8sq\") pod \"cert-manager-5b446d88c5-cn57q\" (UID: \"2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf\") " pod="cert-manager/cert-manager-5b446d88c5-cn57q" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.346550 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxg8w\" (UniqueName: \"kubernetes.io/projected/648c22f9-bc82-4a6a-9b68-b9b557f0c243-kube-api-access-gxg8w\") pod \"cert-manager-webhook-5655c58dd6-88pc4\" (UID: \"648c22f9-bc82-4a6a-9b68-b9b557f0c243\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.346616 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd78n\" (UniqueName: \"kubernetes.io/projected/329487df-e7b0-4925-8c85-155c96453929-kube-api-access-vd78n\") pod \"cert-manager-cainjector-7f985d654d-4jlqn\" (UID: \"329487df-e7b0-4925-8c85-155c96453929\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.346653 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt8sq\" (UniqueName: \"kubernetes.io/projected/2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf-kube-api-access-dt8sq\") pod \"cert-manager-5b446d88c5-cn57q\" (UID: \"2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf\") " pod="cert-manager/cert-manager-5b446d88c5-cn57q" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.365627 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxg8w\" (UniqueName: \"kubernetes.io/projected/648c22f9-bc82-4a6a-9b68-b9b557f0c243-kube-api-access-gxg8w\") pod \"cert-manager-webhook-5655c58dd6-88pc4\" (UID: \"648c22f9-bc82-4a6a-9b68-b9b557f0c243\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.365687 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vd78n\" (UniqueName: \"kubernetes.io/projected/329487df-e7b0-4925-8c85-155c96453929-kube-api-access-vd78n\") pod \"cert-manager-cainjector-7f985d654d-4jlqn\" (UID: \"329487df-e7b0-4925-8c85-155c96453929\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.365714 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt8sq\" (UniqueName: \"kubernetes.io/projected/2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf-kube-api-access-dt8sq\") pod \"cert-manager-5b446d88c5-cn57q\" (UID: \"2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf\") " pod="cert-manager/cert-manager-5b446d88c5-cn57q" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.522265 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.538030 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cn57q" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.546849 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.764725 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4jlqn"] Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.779378 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.813796 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn57q"] Oct 02 11:28:57 crc kubenswrapper[4658]: W1002 11:28:57.820287 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e28d2d3_12b8_490d_a3f6_6e88c19e4cdf.slice/crio-72aaf82db9235bad226c15be1ddc017a0d90b240dbc58833a0dbf7cb7c3e9156 WatchSource:0}: Error finding container 72aaf82db9235bad226c15be1ddc017a0d90b240dbc58833a0dbf7cb7c3e9156: Status 404 returned error can't find the container with id 72aaf82db9235bad226c15be1ddc017a0d90b240dbc58833a0dbf7cb7c3e9156 Oct 02 11:28:57 crc kubenswrapper[4658]: I1002 11:28:57.853113 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-88pc4"] Oct 02 11:28:58 crc kubenswrapper[4658]: I1002 11:28:58.152101 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" event={"ID":"648c22f9-bc82-4a6a-9b68-b9b557f0c243","Type":"ContainerStarted","Data":"aa0b636f8560ede21b173409d900b4328bcf2a3dbc0e73ed3944d83749b8f470"} Oct 02 11:28:58 crc kubenswrapper[4658]: I1002 11:28:58.153820 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cn57q" event={"ID":"2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf","Type":"ContainerStarted","Data":"72aaf82db9235bad226c15be1ddc017a0d90b240dbc58833a0dbf7cb7c3e9156"} Oct 02 11:28:58 crc kubenswrapper[4658]: I1002 11:28:58.155728 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" event={"ID":"329487df-e7b0-4925-8c85-155c96453929","Type":"ContainerStarted","Data":"7bfbf6cca8f678d945a4562b4a61a63eb8fe287ba396071d12b816cbad158bd9"} Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 
11:29:02.182788 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" event={"ID":"648c22f9-bc82-4a6a-9b68-b9b557f0c243","Type":"ContainerStarted","Data":"13c2bb01efeabd7da710fa1e080631fb8883a7fa5be490a550085997647dbd18"} Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.183790 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.184541 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cn57q" event={"ID":"2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf","Type":"ContainerStarted","Data":"759654afcce2fb7ca518e65739f91f6527ba6e86361c97588b1e3fbacaf3a2f4"} Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.186532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" event={"ID":"329487df-e7b0-4925-8c85-155c96453929","Type":"ContainerStarted","Data":"8398facce2e18e5753125ce6c880f3494a2a3d2327ad2c5eb4cfb098abbca50e"} Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.229098 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" podStartSLOduration=1.4761233 podStartE2EDuration="5.229073078s" podCreationTimestamp="2025-10-02 11:28:57 +0000 UTC" firstStartedPulling="2025-10-02 11:28:57.859467387 +0000 UTC m=+618.750620954" lastFinishedPulling="2025-10-02 11:29:01.612417155 +0000 UTC m=+622.503570732" observedRunningTime="2025-10-02 11:29:02.215226513 +0000 UTC m=+623.106380080" watchObservedRunningTime="2025-10-02 11:29:02.229073078 +0000 UTC m=+623.120226645" Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.229214 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-cn57q" podStartSLOduration=1.447929646 podStartE2EDuration="5.229210402s" podCreationTimestamp="2025-10-02 11:28:57 +0000 UTC" firstStartedPulling="2025-10-02 11:28:57.823321407 +0000 UTC m=+618.714474964" lastFinishedPulling="2025-10-02 11:29:01.604602113 +0000 UTC m=+622.495755720" observedRunningTime="2025-10-02 11:29:02.228363966 +0000 UTC m=+623.119517523" watchObservedRunningTime="2025-10-02 11:29:02.229210402 +0000 UTC m=+623.120363979" Oct 02 11:29:02 crc kubenswrapper[4658]: I1002 11:29:02.243054 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-4jlqn" podStartSLOduration=1.343214513 podStartE2EDuration="5.243035626s" podCreationTimestamp="2025-10-02 11:28:57 +0000 UTC" firstStartedPulling="2025-10-02 11:28:57.779103867 +0000 UTC m=+618.670257434" lastFinishedPulling="2025-10-02 11:29:01.67892495 +0000 UTC m=+622.570078547" observedRunningTime="2025-10-02 11:29:02.24187665 +0000 UTC m=+623.133030267" watchObservedRunningTime="2025-10-02 11:29:02.243035626 +0000 UTC m=+623.134189193" Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.430042 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2t8w8"] Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.430956 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-controller" containerID="cri-o://fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2" gracePeriod=30 Oct 02 
11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431363 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="sbdb" containerID="cri-o://274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431446 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431581 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="nbdb" containerID="cri-o://fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431634 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="northd" containerID="cri-o://54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431673 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-acl-logging" containerID="cri-o://8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.431706 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-node" containerID="cri-o://3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.479577 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" containerID="cri-o://6a0357622298d3d3fe3388b77219e229e40bef5c13d2a18b87c4c843459c761d" gracePeriod=30 Oct 02 11:29:07 crc kubenswrapper[4658]: I1002 11:29:07.549169 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-88pc4" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.232890 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/2.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.234152 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/1.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.234224 4658 generic.go:334] "Generic (PLEG): container finished" podID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5" containerID="f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006" exitCode=2 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.234379 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerDied","Data":"f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.234469 4658 scope.go:117] "RemoveContainer" containerID="96d2c86a51c49a5e3a2fb2686f153767ef3ea30df91f6a14542a83682e5923c5" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.235149 4658 scope.go:117] "RemoveContainer" containerID="f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.235450 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-thtgx_openshift-multus(69a005aa-c7db-4d46-968b-8a9a0c00bbd5)\"" pod="openshift-multus/multus-thtgx" podUID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.239370 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovnkube-controller/3.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.242739 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovn-acl-logging/0.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243223 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovn-controller/0.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243590 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="6a0357622298d3d3fe3388b77219e229e40bef5c13d2a18b87c4c843459c761d" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243615 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243625 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243633 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243640 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243648 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6" exitCode=0 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243656 4658 generic.go:334] "Generic (PLEG): container finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da" exitCode=143 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243665 4658 generic.go:334] "Generic (PLEG): container 
finished" podID="dea12458-2637-446e-b388-4f139b3fd000" containerID="fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2" exitCode=143 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243688 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"6a0357622298d3d3fe3388b77219e229e40bef5c13d2a18b87c4c843459c761d"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243712 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243723 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243731 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243741 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243750 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243758 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.243766 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2"} Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.292658 4658 scope.go:117] "RemoveContainer" containerID="8c4f9e9aed412fcbc8b196dc9560e35d990f7bd961ec69f0a6eb1ef47d9e1023" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.486628 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovn-acl-logging/0.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.487242 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2t8w8_dea12458-2637-446e-b388-4f139b3fd000/ovn-controller/0.log" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.487833 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570210 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mshsz"] Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570412 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570427 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570438 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="sbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570444 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="sbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570450 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-node" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570456 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-node" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570465 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="nbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570471 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="nbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570482 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570487 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570494 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570499 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570508 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-acl-logging" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570513 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-acl-logging" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570520 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kubecfg-setup" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570526 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kubecfg-setup" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570533 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570539 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570545 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="northd" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570551 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="northd" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570557 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570563 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570570 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570575 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570656 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570668 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570676 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovn-acl-logging" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570683 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570692 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-node" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570699 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570706 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="nbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570716 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="sbdb" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570724 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="northd" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570731 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:29:08 crc kubenswrapper[4658]: E1002 11:29:08.570811 4658 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570818 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570908 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.570920 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea12458-2637-446e-b388-4f139b3fd000" containerName="ovnkube-controller" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.572340 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628747 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628839 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628878 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628908 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628944 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.628976 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629012 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629041 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629081 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629118 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629165 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629207 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629250 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8hnd\" (UniqueName: \"kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629345 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629714 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629877 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630586 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629460 4658 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629467 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629505 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629512 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log" (OuterVolumeSpecName: "node-log") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629530 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash" (OuterVolumeSpecName: "host-slash") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629524 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629594 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629617 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629636 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629787 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.629926 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630041 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630494 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630659 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630737 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.630974 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket" (OuterVolumeSpecName: "log-socket") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631002 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631027 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631033 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns\") pod \"dea12458-2637-446e-b388-4f139b3fd000\" (UID: \"dea12458-2637-446e-b388-4f139b3fd000\") " Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631114 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631327 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-config\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631378 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-bin\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631469 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-etc-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631528 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2znz\" (UniqueName: \"kubernetes.io/projected/43e2ebb6-5dec-4248-bde9-51d6233f816c-kube-api-access-w2znz\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631587 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631632 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-netns\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631674 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-slash\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631785 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-ovn\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.631883 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-netd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632001 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-node-log\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632084 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632189 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-var-lib-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632359 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovn-node-metrics-cert\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632467 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632574 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-systemd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632647 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-kubelet\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632735 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-log-socket\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.632826 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-script-lib\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633025 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-env-overrides\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633134 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-systemd-units\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633280 4658 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633543 4658 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633606 4658 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-netd\") on node 
\"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633626 4658 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633642 4658 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633658 4658 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633674 4658 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633690 4658 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633706 4658 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633726 4658 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633760 4658 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633778 4658 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633794 4658 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633810 4658 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633825 4658 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.633843 4658 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 
11:29:08.633860 4658 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea12458-2637-446e-b388-4f139b3fd000-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.635768 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.643886 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.643886 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd" (OuterVolumeSpecName: "kube-api-access-b8hnd") pod "dea12458-2637-446e-b388-4f139b3fd000" (UID: "dea12458-2637-446e-b388-4f139b3fd000"). InnerVolumeSpecName "kube-api-access-b8hnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734755 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-node-log\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734803 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734827 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-var-lib-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734854 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovn-node-metrics-cert\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734875 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc 
kubenswrapper[4658]: I1002 11:29:08.734891 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-systemd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734909 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-kubelet\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734927 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-log-socket\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734943 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-script-lib\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734952 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-var-lib-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734986 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-kubelet\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734965 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-env-overrides\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735029 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735021 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-log-socket\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735070 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-systemd-units\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735047 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-systemd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734958 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735120 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-config\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735146 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-bin\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735167 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-etc-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735200 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2znz\" (UniqueName: \"kubernetes.io/projected/43e2ebb6-5dec-4248-bde9-51d6233f816c-kube-api-access-w2znz\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735229 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-bin\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735241 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735267 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-netns\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735267 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-etc-openvswitch\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735320 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-slash\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735323 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735367 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-ovn\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735386 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-netd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735415 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-run-netns\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735119 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-systemd-units\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.734903 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-node-log\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735494 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-run-ovn\") pod \"ovnkube-node-mshsz\" (UID: 
\"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735480 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-slash\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735517 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e2ebb6-5dec-4248-bde9-51d6233f816c-host-cni-netd\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735553 4658 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea12458-2637-446e-b388-4f139b3fd000-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735567 4658 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea12458-2637-446e-b388-4f139b3fd000-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735581 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8hnd\" (UniqueName: \"kubernetes.io/projected/dea12458-2637-446e-b388-4f139b3fd000-kube-api-access-b8hnd\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.735780 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-env-overrides\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.736102 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-script-lib\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.736131 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovnkube-config\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.737944 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e2ebb6-5dec-4248-bde9-51d6233f816c-ovn-node-metrics-cert\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.758099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2znz\" (UniqueName: \"kubernetes.io/projected/43e2ebb6-5dec-4248-bde9-51d6233f816c-kube-api-access-w2znz\") pod \"ovnkube-node-mshsz\" (UID: \"43e2ebb6-5dec-4248-bde9-51d6233f816c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.826424 4658 scope.go:117] "RemoveContainer" containerID="fba0961086c42ead8b5ec887cfa30de6b9e90ab16d80e13fa5b2e35680365cd9" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.846772 4658 scope.go:117] "RemoveContainer" containerID="3756ff9b631f8410ad417997b7938531b4e29b3a4d3bd9d49c227fa00608d0c6" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.862194 4658 scope.go:117] "RemoveContainer" containerID="6a0357622298d3d3fe3388b77219e229e40bef5c13d2a18b87c4c843459c761d" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.880274 4658 scope.go:117] "RemoveContainer" containerID="d320d1b2557399c5ca618987f955c60703da9d9c7a50065576c92314312ed6cc" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.884902 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.899318 4658 scope.go:117] "RemoveContainer" containerID="d5ddd7eb508091f90fbfd5116446e91776ceb59a11cd6d4932ee3880e0633d60" Oct 02 11:29:08 crc kubenswrapper[4658]: W1002 11:29:08.908019 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e2ebb6_5dec_4248_bde9_51d6233f816c.slice/crio-aafc803526482c04584bb036a0802ac28564ef2d4014b444e1b0124ed20b0752 WatchSource:0}: Error finding container aafc803526482c04584bb036a0802ac28564ef2d4014b444e1b0124ed20b0752: Status 404 returned error can't find the container with id aafc803526482c04584bb036a0802ac28564ef2d4014b444e1b0124ed20b0752 Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.917448 4658 scope.go:117] "RemoveContainer" containerID="274e671156b5a380b5cbd507eb420bec31ebff2b7b99179123dbff544200fce1" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.933199 4658 scope.go:117] "RemoveContainer" containerID="54ff86ee01e2dc79b58cd23fc428c11c6d91cc2bbd68ab6778508b7ab5b41c02" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.945430 4658 scope.go:117] "RemoveContainer" containerID="fd2d4fe2c18960ffe2e8af74162db4573f9e057a8a8710c110cc414c062f26f2" Oct 02 11:29:08 crc kubenswrapper[4658]: I1002 11:29:08.961387 4658 scope.go:117] "RemoveContainer" containerID="8113b36a9749f21d5b074865fe3d9b51c6ac0b92897174fbaa6b802c5ee434da" Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.251522 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/2.log" Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.253138 4658 generic.go:334] "Generic (PLEG): container finished" podID="43e2ebb6-5dec-4248-bde9-51d6233f816c" containerID="a17bce11b2db53332d619b5c8dc24eac2dd7395666a19b9d22dad24ceeca0f61" exitCode=0 Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.253228 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.253223 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerDied","Data":"a17bce11b2db53332d619b5c8dc24eac2dd7395666a19b9d22dad24ceeca0f61"} Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.253584 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"aafc803526482c04584bb036a0802ac28564ef2d4014b444e1b0124ed20b0752"} Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.253704 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2t8w8" event={"ID":"dea12458-2637-446e-b388-4f139b3fd000","Type":"ContainerDied","Data":"99952b290a04ac2328b8df7609f76d5d287fa5ccad3f5e0120a0de11aadaf9b9"} Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.323878 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2t8w8"] Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.326544 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2t8w8"] Oct 02 11:29:09 crc kubenswrapper[4658]: I1002 11:29:09.955838 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea12458-2637-446e-b388-4f139b3fd000" path="/var/lib/kubelet/pods/dea12458-2637-446e-b388-4f139b3fd000/volumes" Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261213 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"cbc1293d46de983cf8347fcfd67289c998104d5359e6bc5af7a3aea3bba70551"} Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261254 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"5f1759193362ffa8cdc2c6fa46f7301f322edbc4e9187679725005be8209cb42"} Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261264 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"6a87f58ec2d6550f32d0fabc819d50e12cf122a527d0fae9dd84fd04a3e53a57"} Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261273 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"d11d32cce6ca93739e17150b57afabb20523bb36b291e786019f570a6f1e95c2"} Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261282 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"c269cd6e4fb523cd6e4ec18df144278b33809fed335f707ccfe5aad9e980843c"} Oct 02 11:29:10 crc kubenswrapper[4658]: I1002 11:29:10.261303 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"c2b89ec3eb30a5adefb38512539590052d02f1c9793dbb0cf9a1402873c9a7a5"} Oct 02 11:29:12 
Oct 02 11:29:12 crc kubenswrapper[4658]: I1002 11:29:12.276208 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"011b303326f2d76528b0fb85cd374d3ab07828b0bbdaa0401d3d85859346a919"} Oct 02 11:29:15 crc kubenswrapper[4658]: I1002 11:29:15.297671 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" event={"ID":"43e2ebb6-5dec-4248-bde9-51d6233f816c","Type":"ContainerStarted","Data":"c47ca56c68f684ae4e829e1edd3f00a0e566fbce1c8a007b03fbabed556a381c"} Oct 02 11:29:15 crc kubenswrapper[4658]: I1002 11:29:15.298402 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:15 crc kubenswrapper[4658]: I1002 11:29:15.339031 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:15 crc kubenswrapper[4658]: I1002 11:29:15.365182 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" podStartSLOduration=7.365166868 podStartE2EDuration="7.365166868s" podCreationTimestamp="2025-10-02 11:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:15.332987845 +0000 UTC m=+636.224141412" watchObservedRunningTime="2025-10-02 11:29:15.365166868 +0000 UTC m=+636.256320435" Oct 02 11:29:16 crc kubenswrapper[4658]: I1002 11:29:16.303133 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:16 crc kubenswrapper[4658]: I1002 11:29:16.303554 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:16 crc kubenswrapper[4658]: I1002 11:29:16.330852 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:22 crc kubenswrapper[4658]: I1002 11:29:22.948643 4658 scope.go:117] "RemoveContainer" containerID="f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006" Oct 02 11:29:22 crc kubenswrapper[4658]: E1002 11:29:22.949458 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-thtgx_openshift-multus(69a005aa-c7db-4d46-968b-8a9a0c00bbd5)\"" pod="openshift-multus/multus-thtgx" podUID="69a005aa-c7db-4d46-968b-8a9a0c00bbd5" Oct 02 11:29:33 crc kubenswrapper[4658]: I1002 11:29:33.949384 4658 scope.go:117] "RemoveContainer" containerID="f04b87c43afe012e11419112bd1a2b96826666a7720fc6cef90e8211df145006" Oct 02 11:29:34 crc kubenswrapper[4658]: I1002 11:29:34.418614 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-thtgx_69a005aa-c7db-4d46-968b-8a9a0c00bbd5/kube-multus/2.log" Oct 02 11:29:34 crc kubenswrapper[4658]: I1002 11:29:34.419152 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-thtgx" event={"ID":"69a005aa-c7db-4d46-968b-8a9a0c00bbd5","Type":"ContainerStarted","Data":"f602ad93db96badc9dc8087ef6c82cd7be2449f572a985d2c785cf7cc61d0805"}
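
The 11:29:22 entry above shows kube-multus in CrashLoopBackOff with a 20s back-off window. Kubelet's restart back-off is commonly described as starting around 10s and doubling per failed restart up to a five-minute cap, which would make the 20s window the second failure; the constants below are assumptions for illustration, not values read from this kubelet:

    // Illustrative back-off arithmetic; base and cap are assumed constants.
    package main

    import (
    	"fmt"
    	"time"
    )

    func crashLoopDelay(restarts int) time.Duration {
    	const (
    		base     = 10 * time.Second
    		maxDelay = 5 * time.Minute
    	)
    	d := base
    	for i := 1; i < restarts; i++ {
    		d *= 2 // double the window on each failed restart
    		if d >= maxDelay {
    			return maxDelay
    		}
    	}
    	return d
    }

    func main() {
    	for r := 1; r <= 6; r++ {
    		fmt.Printf("restart %d: back-off %s\n", r, crashLoopDelay(r))
    	}
    	// restart 2 prints "back-off 20s", matching the kube-multus message above.
    }
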
pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf"] Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.547803 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.559735 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.561419 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf"] Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.690275 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zq6\" (UniqueName: \"kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.690371 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.690460 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.792053 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9zq6\" (UniqueName: \"kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.792128 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.792182 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.792701 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.792829 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.818090 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9zq6\" (UniqueName: \"kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:35 crc kubenswrapper[4658]: I1002 11:29:35.868044 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:36 crc kubenswrapper[4658]: I1002 11:29:36.038666 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf"] Oct 02 11:29:36 crc kubenswrapper[4658]: I1002 11:29:36.433477 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerStarted","Data":"9fcd2fbf044aaf67e372a4086a4fd757654776c6593877f0126b5f16ba374fe5"} Oct 02 11:29:36 crc kubenswrapper[4658]: I1002 11:29:36.434034 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerStarted","Data":"99cdc9422976dcd12ea29f11ff0eeda3bd62d170226486a8b927d67c819becb4"} Oct 02 11:29:37 crc kubenswrapper[4658]: I1002 11:29:37.441081 4658 generic.go:334] "Generic (PLEG): container finished" podID="d3132797-270c-4510-9f55-754ad5e47f34" containerID="9fcd2fbf044aaf67e372a4086a4fd757654776c6593877f0126b5f16ba374fe5" exitCode=0 Oct 02 11:29:37 crc kubenswrapper[4658]: I1002 11:29:37.441118 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerDied","Data":"9fcd2fbf044aaf67e372a4086a4fd757654776c6593877f0126b5f16ba374fe5"} Oct 02 11:29:38 crc kubenswrapper[4658]: I1002 11:29:38.906399 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mshsz" Oct 02 11:29:39 crc kubenswrapper[4658]: I1002 11:29:39.452876 4658 generic.go:334] "Generic (PLEG): container finished" 
podID="d3132797-270c-4510-9f55-754ad5e47f34" containerID="a463830ebb924c71fb2284091005fb4c5ff5bb346f5381327a47863b6a3c936a" exitCode=0 Oct 02 11:29:39 crc kubenswrapper[4658]: I1002 11:29:39.452917 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerDied","Data":"a463830ebb924c71fb2284091005fb4c5ff5bb346f5381327a47863b6a3c936a"} Oct 02 11:29:40 crc kubenswrapper[4658]: I1002 11:29:40.461358 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerStarted","Data":"dd98092f4d23aee06c833aec73a309549acf75f811310e0d9e789cdc323abd4c"} Oct 02 11:29:40 crc kubenswrapper[4658]: I1002 11:29:40.483275 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" podStartSLOduration=4.345595314 podStartE2EDuration="5.483255984s" podCreationTimestamp="2025-10-02 11:29:35 +0000 UTC" firstStartedPulling="2025-10-02 11:29:37.442886983 +0000 UTC m=+658.334040550" lastFinishedPulling="2025-10-02 11:29:38.580547653 +0000 UTC m=+659.471701220" observedRunningTime="2025-10-02 11:29:40.481933451 +0000 UTC m=+661.373087018" watchObservedRunningTime="2025-10-02 11:29:40.483255984 +0000 UTC m=+661.374409561" Oct 02 11:29:41 crc kubenswrapper[4658]: I1002 11:29:41.469774 4658 generic.go:334] "Generic (PLEG): container finished" podID="d3132797-270c-4510-9f55-754ad5e47f34" containerID="dd98092f4d23aee06c833aec73a309549acf75f811310e0d9e789cdc323abd4c" exitCode=0 Oct 02 11:29:41 crc kubenswrapper[4658]: I1002 11:29:41.469892 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerDied","Data":"dd98092f4d23aee06c833aec73a309549acf75f811310e0d9e789cdc323abd4c"} Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.710333 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.781632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle\") pod \"d3132797-270c-4510-9f55-754ad5e47f34\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.781838 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util\") pod \"d3132797-270c-4510-9f55-754ad5e47f34\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.781925 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9zq6\" (UniqueName: \"kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6\") pod \"d3132797-270c-4510-9f55-754ad5e47f34\" (UID: \"d3132797-270c-4510-9f55-754ad5e47f34\") " Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.785809 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle" (OuterVolumeSpecName: "bundle") pod "d3132797-270c-4510-9f55-754ad5e47f34" (UID: "d3132797-270c-4510-9f55-754ad5e47f34"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.791057 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6" (OuterVolumeSpecName: "kube-api-access-b9zq6") pod "d3132797-270c-4510-9f55-754ad5e47f34" (UID: "d3132797-270c-4510-9f55-754ad5e47f34"). InnerVolumeSpecName "kube-api-access-b9zq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.792804 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util" (OuterVolumeSpecName: "util") pod "d3132797-270c-4510-9f55-754ad5e47f34" (UID: "d3132797-270c-4510-9f55-754ad5e47f34"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.883287 4658 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.883348 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9zq6\" (UniqueName: \"kubernetes.io/projected/d3132797-270c-4510-9f55-754ad5e47f34-kube-api-access-b9zq6\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4658]: I1002 11:29:42.883363 4658 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3132797-270c-4510-9f55-754ad5e47f34-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:43 crc kubenswrapper[4658]: I1002 11:29:43.488850 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" event={"ID":"d3132797-270c-4510-9f55-754ad5e47f34","Type":"ContainerDied","Data":"99cdc9422976dcd12ea29f11ff0eeda3bd62d170226486a8b927d67c819becb4"} Oct 02 11:29:43 crc kubenswrapper[4658]: I1002 11:29:43.488894 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99cdc9422976dcd12ea29f11ff0eeda3bd62d170226486a8b927d67c819becb4" Oct 02 11:29:43 crc kubenswrapper[4658]: I1002 11:29:43.488928 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.591161 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk"] Oct 02 11:29:52 crc kubenswrapper[4658]: E1002 11:29:52.591976 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="util" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.591991 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="util" Oct 02 11:29:52 crc kubenswrapper[4658]: E1002 11:29:52.592003 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="pull" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.592010 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="pull" Oct 02 11:29:52 crc kubenswrapper[4658]: E1002 11:29:52.592022 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="extract" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.592029 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="extract" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.592140 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3132797-270c-4510-9f55-754ad5e47f34" containerName="extract" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.592664 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.595276 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.595765 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.600010 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-szrzp" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.608855 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.710014 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7x6\" (UniqueName: \"kubernetes.io/projected/6abb4e77-380e-45f9-94dd-0511e0194885-kube-api-access-gm7x6\") pod \"obo-prometheus-operator-7c8cf85677-wxwrk\" (UID: \"6abb4e77-380e-45f9-94dd-0511e0194885\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.719257 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.720171 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.724460 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-jkddk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.724480 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.727784 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.728490 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.736775 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.740752 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.811463 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.811545 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.811577 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.811608 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7x6\" (UniqueName: \"kubernetes.io/projected/6abb4e77-380e-45f9-94dd-0511e0194885-kube-api-access-gm7x6\") pod \"obo-prometheus-operator-7c8cf85677-wxwrk\" (UID: \"6abb4e77-380e-45f9-94dd-0511e0194885\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.811632 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.832869 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm7x6\" (UniqueName: \"kubernetes.io/projected/6abb4e77-380e-45f9-94dd-0511e0194885-kube-api-access-gm7x6\") pod \"obo-prometheus-operator-7c8cf85677-wxwrk\" (UID: \"6abb4e77-380e-45f9-94dd-0511e0194885\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.909421 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.913161 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.913241 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.913273 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.913323 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.918525 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.918948 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.920450 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c427e03-4bb9-4dc4-a866-765e097e498f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g\" (UID: \"8c427e03-4bb9-4dc4-a866-765e097e498f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.921896 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8c24809-b49a-4a7d-9fd8-58f83c33a290-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm\" (UID: \"e8c24809-b49a-4a7d-9fd8-58f83c33a290\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.925866 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-b7497"] Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.927130 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.929615 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.934645 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lgxfd" Oct 02 11:29:52 crc kubenswrapper[4658]: I1002 11:29:52.959138 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-b7497"] Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.014759 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwx9s\" (UniqueName: \"kubernetes.io/projected/6ae51e31-b742-4b5c-870a-d7bfc95151f1-kube-api-access-dwx9s\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.014855 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ae51e31-b742-4b5c-870a-d7bfc95151f1-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.036137 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.049568 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.115959 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ae51e31-b742-4b5c-870a-d7bfc95151f1-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.119351 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwx9s\" (UniqueName: \"kubernetes.io/projected/6ae51e31-b742-4b5c-870a-d7bfc95151f1-kube-api-access-dwx9s\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.129287 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-4qk6g"] Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.130569 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.140942 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ae51e31-b742-4b5c-870a-d7bfc95151f1-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.159757 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-4qk6g"] Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.160054 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-5vx5q" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.168043 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwx9s\" (UniqueName: \"kubernetes.io/projected/6ae51e31-b742-4b5c-870a-d7bfc95151f1-kube-api-access-dwx9s\") pod \"observability-operator-cc5f78dfc-b7497\" (UID: \"6ae51e31-b742-4b5c-870a-d7bfc95151f1\") " pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.222187 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k86b\" (UniqueName: \"kubernetes.io/projected/79a71fa2-31f7-4ce5-9043-cdfad20543ec-kube-api-access-5k86b\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.222261 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79a71fa2-31f7-4ce5-9043-cdfad20543ec-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.306111 4658 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.323899 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k86b\" (UniqueName: \"kubernetes.io/projected/79a71fa2-31f7-4ce5-9043-cdfad20543ec-kube-api-access-5k86b\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.323945 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79a71fa2-31f7-4ce5-9043-cdfad20543ec-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.324963 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79a71fa2-31f7-4ce5-9043-cdfad20543ec-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.345909 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k86b\" (UniqueName: \"kubernetes.io/projected/79a71fa2-31f7-4ce5-9043-cdfad20543ec-kube-api-access-5k86b\") pod \"perses-operator-54bc95c9fb-4qk6g\" (UID: \"79a71fa2-31f7-4ce5-9043-cdfad20543ec\") " pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.366030 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g"] Oct 02 11:29:53 crc kubenswrapper[4658]: W1002 11:29:53.390107 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c427e03_4bb9_4dc4_a866_765e097e498f.slice/crio-a45ccb79c7184f31b463502fee74ff85bacbd8c711cacefa79f1023b93a4c68a WatchSource:0}: Error finding container a45ccb79c7184f31b463502fee74ff85bacbd8c711cacefa79f1023b93a4c68a: Status 404 returned error can't find the container with id a45ccb79c7184f31b463502fee74ff85bacbd8c711cacefa79f1023b93a4c68a Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.455882 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk"] Oct 02 11:29:53 crc kubenswrapper[4658]: W1002 11:29:53.467188 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abb4e77_380e_45f9_94dd_0511e0194885.slice/crio-99db746a4c8f67f8eaf5dd323644b4351e887ade575d551a468bb2a132cd540a WatchSource:0}: Error finding container 99db746a4c8f67f8eaf5dd323644b4351e887ade575d551a468bb2a132cd540a: Status 404 returned error can't find the container with id 99db746a4c8f67f8eaf5dd323644b4351e887ade575d551a468bb2a132cd540a Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.477852 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.556270 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-b7497"] Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.556785 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" event={"ID":"8c427e03-4bb9-4dc4-a866-765e097e498f","Type":"ContainerStarted","Data":"a45ccb79c7184f31b463502fee74ff85bacbd8c711cacefa79f1023b93a4c68a"} Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.559511 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" event={"ID":"6abb4e77-380e-45f9-94dd-0511e0194885","Type":"ContainerStarted","Data":"99db746a4c8f67f8eaf5dd323644b4351e887ade575d551a468bb2a132cd540a"} Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.633488 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm"] Oct 02 11:29:53 crc kubenswrapper[4658]: W1002 11:29:53.640895 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c24809_b49a_4a7d_9fd8_58f83c33a290.slice/crio-69d9b13137d9625e5f54f495bdae39be05db9785752ae1041d25ff83992f659b WatchSource:0}: Error finding container 69d9b13137d9625e5f54f495bdae39be05db9785752ae1041d25ff83992f659b: Status 404 returned error can't find the container with id 69d9b13137d9625e5f54f495bdae39be05db9785752ae1041d25ff83992f659b Oct 02 11:29:53 crc kubenswrapper[4658]: I1002 11:29:53.723472 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-4qk6g"] Oct 02 11:29:54 crc kubenswrapper[4658]: I1002 11:29:54.569276 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" event={"ID":"79a71fa2-31f7-4ce5-9043-cdfad20543ec","Type":"ContainerStarted","Data":"87e9ccf11929561d78e2097a111a8509a47dce209f946cfc216c7bae7f56f2f4"} Oct 02 11:29:54 crc kubenswrapper[4658]: I1002 11:29:54.570589 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" event={"ID":"e8c24809-b49a-4a7d-9fd8-58f83c33a290","Type":"ContainerStarted","Data":"69d9b13137d9625e5f54f495bdae39be05db9785752ae1041d25ff83992f659b"} Oct 02 11:29:54 crc kubenswrapper[4658]: I1002 11:29:54.571616 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" event={"ID":"6ae51e31-b742-4b5c-870a-d7bfc95151f1","Type":"ContainerStarted","Data":"dae57f498e3767dbf634b7eff5ba4a1643e6befdeb8f9aa344ad5282ff707e99"} Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.143698 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr"] Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.145156 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.148200 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.148215 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.164641 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr"] Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.213454 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.213547 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729wz\" (UniqueName: \"kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.213583 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.314596 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.314672 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.314716 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729wz\" (UniqueName: \"kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.318573 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume\") pod 
\"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.323012 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.356094 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729wz\" (UniqueName: \"kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz\") pod \"collect-profiles-29323410-4tkcr\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:00 crc kubenswrapper[4658]: I1002 11:30:00.472665 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:10 crc kubenswrapper[4658]: E1002 11:30:10.570972 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e" Oct 02 11:30:10 crc kubenswrapper[4658]: E1002 11:30:10.571844 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e,Command:[],Args:[--namespace=$(NAMESPACE) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=perses=$(RELATED_IMAGE_PERSES) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:4d25b0e31549d780928d2dd3eed7defd9c6d460deb92dcff0fe72c5023029404,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:f3806c97420ec8ba91895ce7627df7612cccb927c05d7854377f45cdd6c924a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-0-50-rhel9@sha256:4b5e53d226733237fc5abd0476eb3c96162cf3d8da7aeba8deda631fa8987223,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-0-4-rhel9@sha256:53125bddbefca2ba2b57c3fd74bd4b376da803e420201220548878f557bd6610,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-1-0-rhel9@sha256:1dbe9a684271e00c8f36d8b96c9b22f6ee3c6f907ea6ad20980901bd533f9a3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-4-rhel9@sha256:6aafab2c90bcbc6702f2d63d585a764baa8de8207e6af7afa60f3976ddfa9bd3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-3-rhel9@sha256:9f80851e8137c2c5e5c2aee13fc663f6c7124d9524d88c06c1507748ce84e1ed,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-6-1-rhel9@sha256:2c9b2be12f15f06a24393dbab6a31682cee399d42e2cc04b0dcf03b2b598d5cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-6-0-rhel9@sha256:e9042d93f624790c450724158a8323277e4dd136530c763fec8db31f51fd8552,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-0-4-rhel9@sha256:456d45001816b9adc38745e0ad8705bdc0150d03d0f65e0dfa9caf3fb8980fad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-5-rhel9@sha256:f3446969c67c18b44bee38ac946091fe9397a2117cb5b7aacb39406461c1efe1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-4-rhel9@sha256:ade84f8be7d23bd4b9c80e07462dc947280f0bcf6071e6edd927fef54c254b7e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:039e139cf9217bbe72248674df76cbe4baf4bef9f8dc367d2cb51eae9c4aa9d7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:
142180f277f0221ef2d4176f9af6dcdb4e7ab434a68f0dfad2ee5bee0e667ddd,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwx9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-cc5f78dfc-b7497_openshift-operators(6ae51e31-b742-4b5c-870a-d7bfc95151f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:30:10 crc kubenswrapper[4658]: E1002 11:30:10.573060 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" podUID="6ae51e31-b742-4b5c-870a-d7bfc95151f1" Oct 02 11:30:10 crc kubenswrapper[4658]: E1002 11:30:10.708354 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e\\\"\"" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" podUID="6ae51e31-b742-4b5c-870a-d7bfc95151f1" Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.137206 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr"] Oct 02 11:30:11 crc kubenswrapper[4658]: W1002 11:30:11.157802 4658 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa8c328_ed90_4500_866b_f66b33ad5528.slice/crio-e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8 WatchSource:0}: Error finding container e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8: Status 404 returned error can't find the container with id e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8 Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.710444 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" event={"ID":"8c427e03-4bb9-4dc4-a866-765e097e498f","Type":"ContainerStarted","Data":"bdcae6f51436cad45b18323c50cefba07d2a4b2b2b2712a34590d619779af208"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.713243 4658 generic.go:334] "Generic (PLEG): container finished" podID="ffa8c328-ed90-4500-866b-f66b33ad5528" containerID="8441418dca09dd20a05408ca9d560ef275ee4372d85ff0c3e149bab9fb3e199c" exitCode=0 Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.713320 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" event={"ID":"ffa8c328-ed90-4500-866b-f66b33ad5528","Type":"ContainerDied","Data":"8441418dca09dd20a05408ca9d560ef275ee4372d85ff0c3e149bab9fb3e199c"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.713342 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" event={"ID":"ffa8c328-ed90-4500-866b-f66b33ad5528","Type":"ContainerStarted","Data":"e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.715305 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" event={"ID":"6abb4e77-380e-45f9-94dd-0511e0194885","Type":"ContainerStarted","Data":"65907ff1a93052a96e0ae19b0c5b58e852068923bf7939391df424e29a59a49a"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.717119 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" event={"ID":"79a71fa2-31f7-4ce5-9043-cdfad20543ec","Type":"ContainerStarted","Data":"2b8bd4efc6479197bd27c6e8b0aa83b5fa78a05a09705193265028f055e7117f"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.717244 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.718428 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" event={"ID":"e8c24809-b49a-4a7d-9fd8-58f83c33a290","Type":"ContainerStarted","Data":"252c032806c6a033c553324ab720f9ad9b9699c0c4a691ff205ab875517230e1"} Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.787684 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm" podStartSLOduration=2.723914203 podStartE2EDuration="19.78766823s" podCreationTimestamp="2025-10-02 11:29:52 +0000 UTC" firstStartedPulling="2025-10-02 11:29:53.64544494 +0000 UTC m=+674.536598507" lastFinishedPulling="2025-10-02 11:30:10.709198967 +0000 UTC m=+691.600352534" observedRunningTime="2025-10-02 11:30:11.785590293 +0000 UTC m=+692.676743870" watchObservedRunningTime="2025-10-02 11:30:11.78766823 
+0000 UTC m=+692.678821797" Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.790089 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g" podStartSLOduration=2.48727464 podStartE2EDuration="19.790080248s" podCreationTimestamp="2025-10-02 11:29:52 +0000 UTC" firstStartedPulling="2025-10-02 11:29:53.391586887 +0000 UTC m=+674.282740454" lastFinishedPulling="2025-10-02 11:30:10.694392495 +0000 UTC m=+691.585546062" observedRunningTime="2025-10-02 11:30:11.757083365 +0000 UTC m=+692.648236932" watchObservedRunningTime="2025-10-02 11:30:11.790080248 +0000 UTC m=+692.681233815" Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.814101 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" podStartSLOduration=1.83750496 podStartE2EDuration="18.81407952s" podCreationTimestamp="2025-10-02 11:29:53 +0000 UTC" firstStartedPulling="2025-10-02 11:29:53.732135002 +0000 UTC m=+674.623288579" lastFinishedPulling="2025-10-02 11:30:10.708709572 +0000 UTC m=+691.599863139" observedRunningTime="2025-10-02 11:30:11.812756557 +0000 UTC m=+692.703910134" watchObservedRunningTime="2025-10-02 11:30:11.81407952 +0000 UTC m=+692.705233087" Oct 02 11:30:11 crc kubenswrapper[4658]: I1002 11:30:11.851308 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wxwrk" podStartSLOduration=2.624602999 podStartE2EDuration="19.851233159s" podCreationTimestamp="2025-10-02 11:29:52 +0000 UTC" firstStartedPulling="2025-10-02 11:29:53.470562887 +0000 UTC m=+674.361716454" lastFinishedPulling="2025-10-02 11:30:10.697193037 +0000 UTC m=+691.588346614" observedRunningTime="2025-10-02 11:30:11.850811226 +0000 UTC m=+692.741964793" watchObservedRunningTime="2025-10-02 11:30:11.851233159 +0000 UTC m=+692.742386726" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.008410 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.136744 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume\") pod \"ffa8c328-ed90-4500-866b-f66b33ad5528\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.136972 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume\") pod \"ffa8c328-ed90-4500-866b-f66b33ad5528\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.137094 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729wz\" (UniqueName: \"kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz\") pod \"ffa8c328-ed90-4500-866b-f66b33ad5528\" (UID: \"ffa8c328-ed90-4500-866b-f66b33ad5528\") " Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.137707 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffa8c328-ed90-4500-866b-f66b33ad5528" (UID: "ffa8c328-ed90-4500-866b-f66b33ad5528"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.142510 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffa8c328-ed90-4500-866b-f66b33ad5528" (UID: "ffa8c328-ed90-4500-866b-f66b33ad5528"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.148491 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz" (OuterVolumeSpecName: "kube-api-access-729wz") pod "ffa8c328-ed90-4500-866b-f66b33ad5528" (UID: "ffa8c328-ed90-4500-866b-f66b33ad5528"). InnerVolumeSpecName "kube-api-access-729wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.238659 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa8c328-ed90-4500-866b-f66b33ad5528-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.238718 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729wz\" (UniqueName: \"kubernetes.io/projected/ffa8c328-ed90-4500-866b-f66b33ad5528-kube-api-access-729wz\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.238733 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa8c328-ed90-4500-866b-f66b33ad5528-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.729826 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" event={"ID":"ffa8c328-ed90-4500-866b-f66b33ad5528","Type":"ContainerDied","Data":"e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8"} Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.730107 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e61aa5e0ea677ebc2abd9a28c448768bfea2288bf99a2cb17df66ac7e2d401e8" Oct 02 11:30:13 crc kubenswrapper[4658]: I1002 11:30:13.729886 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr" Oct 02 11:30:23 crc kubenswrapper[4658]: I1002 11:30:23.481738 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-4qk6g" Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.430225 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.430848 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.804726 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" event={"ID":"6ae51e31-b742-4b5c-870a-d7bfc95151f1","Type":"ContainerStarted","Data":"dc0ce41ec1362b357d481a51405f652bb3f2cb3219e93c91608735a5bba2d6cb"} Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.805386 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.807398 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" Oct 02 11:30:27 crc kubenswrapper[4658]: I1002 11:30:27.823984 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-b7497" 
podStartSLOduration=2.632055672 podStartE2EDuration="35.823969455s" podCreationTimestamp="2025-10-02 11:29:52 +0000 UTC" firstStartedPulling="2025-10-02 11:29:53.57050193 +0000 UTC m=+674.461655497" lastFinishedPulling="2025-10-02 11:30:26.762415723 +0000 UTC m=+707.653569280" observedRunningTime="2025-10-02 11:30:27.822316992 +0000 UTC m=+708.713470579" watchObservedRunningTime="2025-10-02 11:30:27.823969455 +0000 UTC m=+708.715123022" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.342563 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb"] Oct 02 11:30:46 crc kubenswrapper[4658]: E1002 11:30:46.343383 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa8c328-ed90-4500-866b-f66b33ad5528" containerName="collect-profiles" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.343400 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa8c328-ed90-4500-866b-f66b33ad5528" containerName="collect-profiles" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.343516 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa8c328-ed90-4500-866b-f66b33ad5528" containerName="collect-profiles" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.344426 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.348910 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.353386 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb"] Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.486896 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.487021 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hbl\" (UniqueName: \"kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.487058 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.588671 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hbl\" (UniqueName: 
\"kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.588736 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.588774 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.589409 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.589667 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.608628 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hbl\" (UniqueName: \"kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:46 crc kubenswrapper[4658]: I1002 11:30:46.660809 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:47 crc kubenswrapper[4658]: I1002 11:30:47.065654 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb"] Oct 02 11:30:47 crc kubenswrapper[4658]: I1002 11:30:47.922415 4658 generic.go:334] "Generic (PLEG): container finished" podID="65876d37-f714-4df4-8631-442538f87981" containerID="8e266024de4f2521d1c16b9ae0023d9cc6c119d809cd39f23542f8c139b4c666" exitCode=0 Oct 02 11:30:47 crc kubenswrapper[4658]: I1002 11:30:47.922462 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" event={"ID":"65876d37-f714-4df4-8631-442538f87981","Type":"ContainerDied","Data":"8e266024de4f2521d1c16b9ae0023d9cc6c119d809cd39f23542f8c139b4c666"} Oct 02 11:30:47 crc kubenswrapper[4658]: I1002 11:30:47.922504 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" event={"ID":"65876d37-f714-4df4-8631-442538f87981","Type":"ContainerStarted","Data":"ba84cb32c2dc9ca80304141d52e4383b4c3cc5c7c87f093542dcedc2275fc7ef"} Oct 02 11:30:50 crc kubenswrapper[4658]: I1002 11:30:50.944602 4658 generic.go:334] "Generic (PLEG): container finished" podID="65876d37-f714-4df4-8631-442538f87981" containerID="604d08dbde5e42962723056203476dda741edbc722c2cf641d912915fc17913f" exitCode=0 Oct 02 11:30:50 crc kubenswrapper[4658]: I1002 11:30:50.945974 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" event={"ID":"65876d37-f714-4df4-8631-442538f87981","Type":"ContainerDied","Data":"604d08dbde5e42962723056203476dda741edbc722c2cf641d912915fc17913f"} Oct 02 11:30:51 crc kubenswrapper[4658]: I1002 11:30:51.956864 4658 generic.go:334] "Generic (PLEG): container finished" podID="65876d37-f714-4df4-8631-442538f87981" containerID="450393237e33bfc9d96bd0da9c730440343b306155cd838a196c2fd83be11e11" exitCode=0 Oct 02 11:30:51 crc kubenswrapper[4658]: I1002 11:30:51.957202 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" event={"ID":"65876d37-f714-4df4-8631-442538f87981","Type":"ContainerDied","Data":"450393237e33bfc9d96bd0da9c730440343b306155cd838a196c2fd83be11e11"} Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.191083 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.283338 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util\") pod \"65876d37-f714-4df4-8631-442538f87981\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.283488 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9hbl\" (UniqueName: \"kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl\") pod \"65876d37-f714-4df4-8631-442538f87981\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.283525 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle\") pod \"65876d37-f714-4df4-8631-442538f87981\" (UID: \"65876d37-f714-4df4-8631-442538f87981\") " Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.285434 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle" (OuterVolumeSpecName: "bundle") pod "65876d37-f714-4df4-8631-442538f87981" (UID: "65876d37-f714-4df4-8631-442538f87981"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.291437 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl" (OuterVolumeSpecName: "kube-api-access-h9hbl") pod "65876d37-f714-4df4-8631-442538f87981" (UID: "65876d37-f714-4df4-8631-442538f87981"). InnerVolumeSpecName "kube-api-access-h9hbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.295091 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util" (OuterVolumeSpecName: "util") pod "65876d37-f714-4df4-8631-442538f87981" (UID: "65876d37-f714-4df4-8631-442538f87981"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.384886 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9hbl\" (UniqueName: \"kubernetes.io/projected/65876d37-f714-4df4-8631-442538f87981-kube-api-access-h9hbl\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.384953 4658 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.384969 4658 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65876d37-f714-4df4-8631-442538f87981-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.968785 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" event={"ID":"65876d37-f714-4df4-8631-442538f87981","Type":"ContainerDied","Data":"ba84cb32c2dc9ca80304141d52e4383b4c3cc5c7c87f093542dcedc2275fc7ef"} Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.969029 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba84cb32c2dc9ca80304141d52e4383b4c3cc5c7c87f093542dcedc2275fc7ef" Oct 02 11:30:53 crc kubenswrapper[4658]: I1002 11:30:53.968852 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.429877 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.430955 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.878215 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-zvskn"] Oct 02 11:30:57 crc kubenswrapper[4658]: E1002 11:30:57.879056 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="extract" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.879078 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="extract" Oct 02 11:30:57 crc kubenswrapper[4658]: E1002 11:30:57.879105 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="pull" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.879113 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="pull" Oct 02 11:30:57 crc kubenswrapper[4658]: E1002 11:30:57.879126 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="util" Oct 02 11:30:57 crc 
kubenswrapper[4658]: I1002 11:30:57.879134 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="util" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.879265 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="65876d37-f714-4df4-8631-442538f87981" containerName="extract" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.879734 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.881882 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.882017 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.882362 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-552kk" Oct 02 11:30:57 crc kubenswrapper[4658]: I1002 11:30:57.891144 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-zvskn"] Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.046284 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qkn\" (UniqueName: \"kubernetes.io/projected/08cea959-43c4-4ecc-b38d-2960b5d8180c-kube-api-access-f2qkn\") pod \"nmstate-operator-858ddd8f98-zvskn\" (UID: \"08cea959-43c4-4ecc-b38d-2960b5d8180c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.147992 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qkn\" (UniqueName: \"kubernetes.io/projected/08cea959-43c4-4ecc-b38d-2960b5d8180c-kube-api-access-f2qkn\") pod \"nmstate-operator-858ddd8f98-zvskn\" (UID: \"08cea959-43c4-4ecc-b38d-2960b5d8180c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.166660 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qkn\" (UniqueName: \"kubernetes.io/projected/08cea959-43c4-4ecc-b38d-2960b5d8180c-kube-api-access-f2qkn\") pod \"nmstate-operator-858ddd8f98-zvskn\" (UID: \"08cea959-43c4-4ecc-b38d-2960b5d8180c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.195790 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.385200 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-zvskn"] Oct 02 11:30:58 crc kubenswrapper[4658]: I1002 11:30:58.997118 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" event={"ID":"08cea959-43c4-4ecc-b38d-2960b5d8180c","Type":"ContainerStarted","Data":"e2e82b1a52c5d98f00a61bd19554f9ca3094b5b098d2a4225f01955b28aec6a3"} Oct 02 11:31:01 crc kubenswrapper[4658]: I1002 11:31:01.009909 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" event={"ID":"08cea959-43c4-4ecc-b38d-2960b5d8180c","Type":"ContainerStarted","Data":"7911d6896b9907fba5ce732abde7831a328b3faf1c4a0bdf945b933115e7e1d1"} Oct 02 11:31:01 crc kubenswrapper[4658]: I1002 11:31:01.025642 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-zvskn" podStartSLOduration=1.5559196929999999 podStartE2EDuration="4.025623423s" podCreationTimestamp="2025-10-02 11:30:57 +0000 UTC" firstStartedPulling="2025-10-02 11:30:58.395888382 +0000 UTC m=+739.287041949" lastFinishedPulling="2025-10-02 11:31:00.865592112 +0000 UTC m=+741.756745679" observedRunningTime="2025-10-02 11:31:01.023981701 +0000 UTC m=+741.915135278" watchObservedRunningTime="2025-10-02 11:31:01.025623423 +0000 UTC m=+741.916776980" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.467038 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.469208 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.474910 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hhtw8" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.498456 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.509555 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.510466 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.515734 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.525673 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rpw4d"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.526659 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.531868 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.572356 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddw46\" (UniqueName: \"kubernetes.io/projected/485529a7-2da9-40c3-adff-56109c78dbc1-kube-api-access-ddw46\") pod \"nmstate-metrics-fdff9cb8d-g8smq\" (UID: \"485529a7-2da9-40c3-adff-56109c78dbc1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.619128 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.619827 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.624690 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-j7nxp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.624772 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.625004 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.629383 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673435 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddw46\" (UniqueName: \"kubernetes.io/projected/485529a7-2da9-40c3-adff-56109c78dbc1-kube-api-access-ddw46\") pod \"nmstate-metrics-fdff9cb8d-g8smq\" (UID: \"485529a7-2da9-40c3-adff-56109c78dbc1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673503 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7whl\" (UniqueName: \"kubernetes.io/projected/3f8f0836-7d23-4df5-8658-79d424122ab3-kube-api-access-n7whl\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673525 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673574 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-dbus-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673683 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rfd2v\" (UniqueName: \"kubernetes.io/projected/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-kube-api-access-rfd2v\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673784 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-nmstate-lock\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.673820 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-ovs-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.694016 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddw46\" (UniqueName: \"kubernetes.io/projected/485529a7-2da9-40c3-adff-56109c78dbc1-kube-api-access-ddw46\") pod \"nmstate-metrics-fdff9cb8d-g8smq\" (UID: \"485529a7-2da9-40c3-adff-56109c78dbc1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775185 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7whl\" (UniqueName: \"kubernetes.io/projected/3f8f0836-7d23-4df5-8658-79d424122ab3-kube-api-access-n7whl\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775562 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775600 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775659 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m7qm\" (UniqueName: \"kubernetes.io/projected/53ded798-0460-49d4-8c75-f21907458150-kube-api-access-8m7qm\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775681 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53ded798-0460-49d4-8c75-f21907458150-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775706 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-dbus-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: E1002 11:31:07.775709 4658 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775722 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfd2v\" (UniqueName: \"kubernetes.io/projected/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-kube-api-access-rfd2v\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775741 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-nmstate-lock\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: E1002 11:31:07.775754 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair podName:bf546db5-7a99-4338-9c1e-0aecfdf1d7fb nodeName:}" failed. No retries permitted until 2025-10-02 11:31:08.275738437 +0000 UTC m=+749.166892004 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair") pod "nmstate-webhook-6cdbc54649-hbfjg" (UID: "bf546db5-7a99-4338-9c1e-0aecfdf1d7fb") : secret "openshift-nmstate-webhook" not found Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775769 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-ovs-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775771 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-nmstate-lock\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.775946 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-ovs-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.776027 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f8f0836-7d23-4df5-8658-79d424122ab3-dbus-socket\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.792253 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b56b999c-9hrb4"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.792999 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.794353 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7whl\" (UniqueName: \"kubernetes.io/projected/3f8f0836-7d23-4df5-8658-79d424122ab3-kube-api-access-n7whl\") pod \"nmstate-handler-rpw4d\" (UID: \"3f8f0836-7d23-4df5-8658-79d424122ab3\") " pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.797559 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.800583 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfd2v\" (UniqueName: \"kubernetes.io/projected/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-kube-api-access-rfd2v\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.804330 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b56b999c-9hrb4"] Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.841222 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877250 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-service-ca\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877316 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877367 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877402 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259p7\" (UniqueName: \"kubernetes.io/projected/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-kube-api-access-259p7\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877426 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m7qm\" (UniqueName: \"kubernetes.io/projected/53ded798-0460-49d4-8c75-f21907458150-kube-api-access-8m7qm\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877450 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53ded798-0460-49d4-8c75-f21907458150-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877518 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-trusted-ca-bundle\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877546 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-oauth-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877568 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-oauth-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.877596 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: E1002 11:31:07.877580 4658 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 02 11:31:07 crc kubenswrapper[4658]: E1002 11:31:07.877655 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert podName:53ded798-0460-49d4-8c75-f21907458150 nodeName:}" failed. No retries permitted until 2025-10-02 11:31:08.377637107 +0000 UTC m=+749.268790674 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-nczcp" (UID: "53ded798-0460-49d4-8c75-f21907458150") : secret "plugin-serving-cert" not found Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.878817 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53ded798-0460-49d4-8c75-f21907458150-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.895287 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m7qm\" (UniqueName: \"kubernetes.io/projected/53ded798-0460-49d4-8c75-f21907458150-kube-api-access-8m7qm\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980176 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-259p7\" (UniqueName: \"kubernetes.io/projected/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-kube-api-access-259p7\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980583 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-trusted-ca-bundle\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980610 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-oauth-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 
11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980636 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-oauth-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980663 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980743 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-service-ca\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.980781 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.982265 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.982486 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-oauth-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.983829 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-service-ca\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.984345 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-trusted-ca-bundle\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.985845 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-oauth-config\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:07 crc kubenswrapper[4658]: I1002 11:31:07.986419 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-console-serving-cert\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.000068 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-259p7\" (UniqueName: \"kubernetes.io/projected/56b1e2e8-6dc2-424d-aeda-582dc7bdaded-kube-api-access-259p7\") pod \"console-6b56b999c-9hrb4\" (UID: \"56b1e2e8-6dc2-424d-aeda-582dc7bdaded\") " pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.056342 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rpw4d" event={"ID":"3f8f0836-7d23-4df5-8658-79d424122ab3","Type":"ContainerStarted","Data":"df2cf7dc1f6e984376a44f63eb5cd7f86d762113e95bfd59ead7e23e286fe2d0"} Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.201923 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.237666 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq"] Oct 02 11:31:08 crc kubenswrapper[4658]: W1002 11:31:08.239780 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod485529a7_2da9_40c3_adff_56109c78dbc1.slice/crio-34ad4a1c3c49808ec621b5eaed543bc47087f06b786632b8eb598c3849609ce3 WatchSource:0}: Error finding container 34ad4a1c3c49808ec621b5eaed543bc47087f06b786632b8eb598c3849609ce3: Status 404 returned error can't find the container with id 34ad4a1c3c49808ec621b5eaed543bc47087f06b786632b8eb598c3849609ce3 Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.285036 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.291034 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bf546db5-7a99-4338-9c1e-0aecfdf1d7fb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hbfjg\" (UID: \"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.379854 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b56b999c-9hrb4"] Oct 02 11:31:08 crc kubenswrapper[4658]: W1002 11:31:08.387081 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b1e2e8_6dc2_424d_aeda_582dc7bdaded.slice/crio-a8031f361a9baa81e70666a5f417aebdb30f15e9955bee7641e030e44ce074a4 WatchSource:0}: Error finding container a8031f361a9baa81e70666a5f417aebdb30f15e9955bee7641e030e44ce074a4: Status 404 returned error can't find the container with id a8031f361a9baa81e70666a5f417aebdb30f15e9955bee7641e030e44ce074a4 Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.387165 4658 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.391431 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/53ded798-0460-49d4-8c75-f21907458150-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nczcp\" (UID: \"53ded798-0460-49d4-8c75-f21907458150\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.429587 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.544715 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.879020 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg"] Oct 02 11:31:08 crc kubenswrapper[4658]: W1002 11:31:08.882964 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf546db5_7a99_4338_9c1e_0aecfdf1d7fb.slice/crio-a8c37fc78158eb5ecffd55171110991bbd2d6e0579c9a29a10c569a2c5ecf62c WatchSource:0}: Error finding container a8c37fc78158eb5ecffd55171110991bbd2d6e0579c9a29a10c569a2c5ecf62c: Status 404 returned error can't find the container with id a8c37fc78158eb5ecffd55171110991bbd2d6e0579c9a29a10c569a2c5ecf62c Oct 02 11:31:08 crc kubenswrapper[4658]: I1002 11:31:08.933763 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp"] Oct 02 11:31:08 crc kubenswrapper[4658]: W1002 11:31:08.940652 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ded798_0460_49d4_8c75_f21907458150.slice/crio-e0a15d4ebd923c3deda5df3cb0a08fbc4d2d1e148e875364a4c944a5d4ec810c WatchSource:0}: Error finding container e0a15d4ebd923c3deda5df3cb0a08fbc4d2d1e148e875364a4c944a5d4ec810c: Status 404 returned error can't find the container with id e0a15d4ebd923c3deda5df3cb0a08fbc4d2d1e148e875364a4c944a5d4ec810c Oct 02 11:31:09 crc kubenswrapper[4658]: I1002 11:31:09.063509 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" event={"ID":"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb","Type":"ContainerStarted","Data":"a8c37fc78158eb5ecffd55171110991bbd2d6e0579c9a29a10c569a2c5ecf62c"} Oct 02 11:31:09 crc kubenswrapper[4658]: I1002 11:31:09.067735 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b56b999c-9hrb4" event={"ID":"56b1e2e8-6dc2-424d-aeda-582dc7bdaded","Type":"ContainerStarted","Data":"14c07cc071a2acd67a496b79b7531a49dbaff9142916529330945af86ecdb69f"} Oct 02 11:31:09 crc kubenswrapper[4658]: I1002 11:31:09.067784 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b56b999c-9hrb4" event={"ID":"56b1e2e8-6dc2-424d-aeda-582dc7bdaded","Type":"ContainerStarted","Data":"a8031f361a9baa81e70666a5f417aebdb30f15e9955bee7641e030e44ce074a4"} Oct 02 11:31:09 crc 
kubenswrapper[4658]: I1002 11:31:09.070023 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" event={"ID":"485529a7-2da9-40c3-adff-56109c78dbc1","Type":"ContainerStarted","Data":"34ad4a1c3c49808ec621b5eaed543bc47087f06b786632b8eb598c3849609ce3"} Oct 02 11:31:09 crc kubenswrapper[4658]: I1002 11:31:09.071619 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" event={"ID":"53ded798-0460-49d4-8c75-f21907458150","Type":"ContainerStarted","Data":"e0a15d4ebd923c3deda5df3cb0a08fbc4d2d1e148e875364a4c944a5d4ec810c"} Oct 02 11:31:09 crc kubenswrapper[4658]: I1002 11:31:09.087286 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b56b999c-9hrb4" podStartSLOduration=2.087270598 podStartE2EDuration="2.087270598s" podCreationTimestamp="2025-10-02 11:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:09.084917304 +0000 UTC m=+749.976070881" watchObservedRunningTime="2025-10-02 11:31:09.087270598 +0000 UTC m=+749.978424165" Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.092403 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" event={"ID":"bf546db5-7a99-4338-9c1e-0aecfdf1d7fb","Type":"ContainerStarted","Data":"3bec53bf2f6a505857174227c8f3cbcb22524a964b5eea6114adb8a9b1b87f49"} Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.093098 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.105574 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rpw4d" event={"ID":"3f8f0836-7d23-4df5-8658-79d424122ab3","Type":"ContainerStarted","Data":"138888af6e98286ec6ab05e6e5264979888007652801b3df91208e0073ef5b57"} Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.105667 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.107694 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" event={"ID":"485529a7-2da9-40c3-adff-56109c78dbc1","Type":"ContainerStarted","Data":"f0cca666123223d54c561cd9a0337df85d14c982c8d1c16efc455e0e8cf959d2"} Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.109976 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" event={"ID":"53ded798-0460-49d4-8c75-f21907458150","Type":"ContainerStarted","Data":"05b7f7476e7d00ce5021555cfee70b22c042d24362d9c7a6a8220d4a9cfe21a2"} Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.113446 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" podStartSLOduration=2.39978214 podStartE2EDuration="5.113422845s" podCreationTimestamp="2025-10-02 11:31:07 +0000 UTC" firstStartedPulling="2025-10-02 11:31:08.885148331 +0000 UTC m=+749.776301898" lastFinishedPulling="2025-10-02 11:31:11.598789036 +0000 UTC m=+752.489942603" observedRunningTime="2025-10-02 11:31:12.11070255 +0000 UTC m=+753.001856127" watchObservedRunningTime="2025-10-02 11:31:12.113422845 +0000 UTC m=+753.004576412" Oct 02 11:31:12 crc kubenswrapper[4658]: 
I1002 11:31:12.132056 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rpw4d" podStartSLOduration=1.425550284 podStartE2EDuration="5.132039181s" podCreationTimestamp="2025-10-02 11:31:07 +0000 UTC" firstStartedPulling="2025-10-02 11:31:07.887657332 +0000 UTC m=+748.778810899" lastFinishedPulling="2025-10-02 11:31:11.594146229 +0000 UTC m=+752.485299796" observedRunningTime="2025-10-02 11:31:12.126974422 +0000 UTC m=+753.018127999" watchObservedRunningTime="2025-10-02 11:31:12.132039181 +0000 UTC m=+753.023192758" Oct 02 11:31:12 crc kubenswrapper[4658]: I1002 11:31:12.145728 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nczcp" podStartSLOduration=2.500695999 podStartE2EDuration="5.145706872s" podCreationTimestamp="2025-10-02 11:31:07 +0000 UTC" firstStartedPulling="2025-10-02 11:31:08.942922611 +0000 UTC m=+749.834076178" lastFinishedPulling="2025-10-02 11:31:11.587933484 +0000 UTC m=+752.479087051" observedRunningTime="2025-10-02 11:31:12.140842299 +0000 UTC m=+753.031995876" watchObservedRunningTime="2025-10-02 11:31:12.145706872 +0000 UTC m=+753.036860439" Oct 02 11:31:15 crc kubenswrapper[4658]: I1002 11:31:15.126979 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" event={"ID":"485529a7-2da9-40c3-adff-56109c78dbc1","Type":"ContainerStarted","Data":"ff14754bf7ab46bd795a0618f7c9a2d6c2027fe3ce28270750cd0e7e95e480bc"} Oct 02 11:31:15 crc kubenswrapper[4658]: I1002 11:31:15.144742 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-g8smq" podStartSLOduration=1.696371385 podStartE2EDuration="8.144686523s" podCreationTimestamp="2025-10-02 11:31:07 +0000 UTC" firstStartedPulling="2025-10-02 11:31:08.241938842 +0000 UTC m=+749.133092409" lastFinishedPulling="2025-10-02 11:31:14.69025398 +0000 UTC m=+755.581407547" observedRunningTime="2025-10-02 11:31:15.142762273 +0000 UTC m=+756.033915860" watchObservedRunningTime="2025-10-02 11:31:15.144686523 +0000 UTC m=+756.035840090" Oct 02 11:31:17 crc kubenswrapper[4658]: I1002 11:31:17.872553 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-rpw4d" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.203001 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.203048 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.207145 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.264784 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.265034 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" podUID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" containerName="controller-manager" containerID="cri-o://f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15" gracePeriod=30 Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 
11:31:18.385665 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.385901 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerName="route-controller-manager" containerID="cri-o://e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c" gracePeriod=30 Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.647000 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.728336 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.736215 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles\") pod \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.736330 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jhk\" (UniqueName: \"kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk\") pod \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.736410 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config\") pod \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.736435 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca\") pod \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.736458 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert\") pod \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\" (UID: \"e75513c8-260e-4da7-8a62-ef8cd4dc52f4\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.737211 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e75513c8-260e-4da7-8a62-ef8cd4dc52f4" (UID: "e75513c8-260e-4da7-8a62-ef8cd4dc52f4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.737885 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "e75513c8-260e-4da7-8a62-ef8cd4dc52f4" (UID: "e75513c8-260e-4da7-8a62-ef8cd4dc52f4"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.737916 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config" (OuterVolumeSpecName: "config") pod "e75513c8-260e-4da7-8a62-ef8cd4dc52f4" (UID: "e75513c8-260e-4da7-8a62-ef8cd4dc52f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.741913 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e75513c8-260e-4da7-8a62-ef8cd4dc52f4" (UID: "e75513c8-260e-4da7-8a62-ef8cd4dc52f4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.741995 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk" (OuterVolumeSpecName: "kube-api-access-m5jhk") pod "e75513c8-260e-4da7-8a62-ef8cd4dc52f4" (UID: "e75513c8-260e-4da7-8a62-ef8cd4dc52f4"). InnerVolumeSpecName "kube-api-access-m5jhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837185 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config\") pod \"a48a6ed4-aed1-433f-85d2-08e6beaea953\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837373 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca\") pod \"a48a6ed4-aed1-433f-85d2-08e6beaea953\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837439 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert\") pod \"a48a6ed4-aed1-433f-85d2-08e6beaea953\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837462 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w274r\" (UniqueName: \"kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r\") pod \"a48a6ed4-aed1-433f-85d2-08e6beaea953\" (UID: \"a48a6ed4-aed1-433f-85d2-08e6beaea953\") " Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837687 4658 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837704 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837713 4658 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-proxy-ca-bundles\") 
on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837725 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jhk\" (UniqueName: \"kubernetes.io/projected/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-kube-api-access-m5jhk\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.837733 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75513c8-260e-4da7-8a62-ef8cd4dc52f4-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.838390 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config" (OuterVolumeSpecName: "config") pod "a48a6ed4-aed1-433f-85d2-08e6beaea953" (UID: "a48a6ed4-aed1-433f-85d2-08e6beaea953"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.838470 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca" (OuterVolumeSpecName: "client-ca") pod "a48a6ed4-aed1-433f-85d2-08e6beaea953" (UID: "a48a6ed4-aed1-433f-85d2-08e6beaea953"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.840503 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r" (OuterVolumeSpecName: "kube-api-access-w274r") pod "a48a6ed4-aed1-433f-85d2-08e6beaea953" (UID: "a48a6ed4-aed1-433f-85d2-08e6beaea953"). InnerVolumeSpecName "kube-api-access-w274r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.840652 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a48a6ed4-aed1-433f-85d2-08e6beaea953" (UID: "a48a6ed4-aed1-433f-85d2-08e6beaea953"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.938468 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.939460 4658 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48a6ed4-aed1-433f-85d2-08e6beaea953-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.939496 4658 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48a6ed4-aed1-433f-85d2-08e6beaea953-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4658]: I1002 11:31:18.939551 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w274r\" (UniqueName: \"kubernetes.io/projected/a48a6ed4-aed1-433f-85d2-08e6beaea953-kube-api-access-w274r\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.154066 4658 generic.go:334] "Generic (PLEG): container finished" podID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" containerID="f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15" exitCode=0 Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.154125 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.154127 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" event={"ID":"e75513c8-260e-4da7-8a62-ef8cd4dc52f4","Type":"ContainerDied","Data":"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15"} Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.154614 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2chjq" event={"ID":"e75513c8-260e-4da7-8a62-ef8cd4dc52f4","Type":"ContainerDied","Data":"32ba0976ef03e44804dd0d355481f615aba3107c4b9788ce8ec71c707e4d93ed"} Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.154643 4658 scope.go:117] "RemoveContainer" containerID="f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.156733 4658 generic.go:334] "Generic (PLEG): container finished" podID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerID="e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c" exitCode=0 Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.159957 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.161886 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" event={"ID":"a48a6ed4-aed1-433f-85d2-08e6beaea953","Type":"ContainerDied","Data":"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c"} Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.161937 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792" event={"ID":"a48a6ed4-aed1-433f-85d2-08e6beaea953","Type":"ContainerDied","Data":"d681b7cdc1babc5a4fd0a961c1fb069ee7d621c45faae875446e238e30c7f9c8"} Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.169863 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b56b999c-9hrb4" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.174158 4658 scope.go:117] "RemoveContainer" containerID="f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15" Oct 02 11:31:19 crc kubenswrapper[4658]: E1002 11:31:19.175271 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15\": container with ID starting with f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15 not found: ID does not exist" containerID="f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.175338 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15"} err="failed to get container status \"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15\": rpc error: code = NotFound desc = could not find container \"f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15\": container with ID starting with f0ce7b236b6499d5f611c7af652abc022dec0526f3e9e525e08c187dcf841f15 not found: ID does not exist" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.175366 4658 scope.go:117] "RemoveContainer" containerID="e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.182917 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.192979 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2chjq"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.206591 4658 scope.go:117] "RemoveContainer" containerID="e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c" Oct 02 11:31:19 crc kubenswrapper[4658]: E1002 11:31:19.207199 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c\": container with ID starting with e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c not found: ID does not exist" containerID="e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.207264 4658 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c"} err="failed to get container status \"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c\": rpc error: code = NotFound desc = could not find container \"e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c\": container with ID starting with e056a33a93be316c08d6a6fbe435b01d43c0b0af67ffd097960b0dbd0d9b1d8c not found: ID does not exist" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.235365 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.238750 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27792"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.251782 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-md7fr"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.677948 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7656689b8c-lh2nl"] Oct 02 11:31:19 crc kubenswrapper[4658]: E1002 11:31:19.678427 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerName="route-controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.678457 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerName="route-controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: E1002 11:31:19.678518 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" containerName="controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.678531 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" containerName="controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.678729 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" containerName="controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.678758 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" containerName="route-controller-manager" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.679563 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.682576 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.683273 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.683273 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.684349 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.685129 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.685949 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.694260 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7656689b8c-lh2nl"] Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.694934 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.852272 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-serving-cert\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.852402 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnr92\" (UniqueName: \"kubernetes.io/projected/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-kube-api-access-xnr92\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.852460 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-client-ca\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.852551 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-config\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.852658 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-proxy-ca-bundles\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.953531 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-client-ca\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.953609 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-config\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.953658 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-proxy-ca-bundles\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.953692 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-serving-cert\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.953718 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnr92\" (UniqueName: \"kubernetes.io/projected/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-kube-api-access-xnr92\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.954542 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-client-ca\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.955177 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-config\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.955333 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-proxy-ca-bundles\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " 
pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.959534 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48a6ed4-aed1-433f-85d2-08e6beaea953" path="/var/lib/kubelet/pods/a48a6ed4-aed1-433f-85d2-08e6beaea953/volumes" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.960220 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75513c8-260e-4da7-8a62-ef8cd4dc52f4" path="/var/lib/kubelet/pods/e75513c8-260e-4da7-8a62-ef8cd4dc52f4/volumes" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.962840 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-serving-cert\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.973783 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnr92\" (UniqueName: \"kubernetes.io/projected/5e6730ee-7b49-43cf-a10e-ea5119e30fa7-kube-api-access-xnr92\") pod \"controller-manager-7656689b8c-lh2nl\" (UID: \"5e6730ee-7b49-43cf-a10e-ea5119e30fa7\") " pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:19 crc kubenswrapper[4658]: I1002 11:31:19.996661 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.107612 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn"] Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.108330 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.110139 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.110565 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.110710 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.110890 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.111026 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.122748 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.144084 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn"] Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.257650 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-client-ca\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.258572 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-config\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.258916 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvx2c\" (UniqueName: \"kubernetes.io/projected/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-kube-api-access-pvx2c\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.258985 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-serving-cert\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.369129 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvx2c\" (UniqueName: \"kubernetes.io/projected/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-kube-api-access-pvx2c\") pod 
\"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.369193 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-serving-cert\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.369238 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-client-ca\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.369271 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-config\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.370634 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-client-ca\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.370897 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-config\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.375062 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-serving-cert\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.387976 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvx2c\" (UniqueName: \"kubernetes.io/projected/ef354e98-f75e-4e35-9d60-4f713b9bf3ea-kube-api-access-pvx2c\") pod \"route-controller-manager-75956f6b99-lpkgn\" (UID: \"ef354e98-f75e-4e35-9d60-4f713b9bf3ea\") " pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.447964 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.473189 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7656689b8c-lh2nl"] Oct 02 11:31:20 crc kubenswrapper[4658]: I1002 11:31:20.640598 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn"] Oct 02 11:31:20 crc kubenswrapper[4658]: W1002 11:31:20.647359 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef354e98_f75e_4e35_9d60_4f713b9bf3ea.slice/crio-9f8ce659a32999f5486e921bdca6bb6e1390c191608707df110c12c67c235bf6 WatchSource:0}: Error finding container 9f8ce659a32999f5486e921bdca6bb6e1390c191608707df110c12c67c235bf6: Status 404 returned error can't find the container with id 9f8ce659a32999f5486e921bdca6bb6e1390c191608707df110c12c67c235bf6 Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.170490 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" event={"ID":"5e6730ee-7b49-43cf-a10e-ea5119e30fa7","Type":"ContainerStarted","Data":"38b4d108637586932e93b2094a785d63953033a98e04faa0eb5e18fa30c2b0c3"} Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.170540 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" event={"ID":"5e6730ee-7b49-43cf-a10e-ea5119e30fa7","Type":"ContainerStarted","Data":"63047361e9ea8da357f7581dc2f02dde1f2ede248855c5c69dc8d5ee257d7610"} Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.170697 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.172149 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" event={"ID":"ef354e98-f75e-4e35-9d60-4f713b9bf3ea","Type":"ContainerStarted","Data":"b3de0f97cedc4fe66ceb8103024aac2a4ab8daffc7d2c0340ef6355f0b9ee6fe"} Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.172197 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" event={"ID":"ef354e98-f75e-4e35-9d60-4f713b9bf3ea","Type":"ContainerStarted","Data":"9f8ce659a32999f5486e921bdca6bb6e1390c191608707df110c12c67c235bf6"} Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.172313 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.175964 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.197073 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7656689b8c-lh2nl" podStartSLOduration=2.197051981 podStartE2EDuration="2.197051981s" podCreationTimestamp="2025-10-02 11:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:21.195225983 +0000 UTC 
m=+762.086379560" watchObservedRunningTime="2025-10-02 11:31:21.197051981 +0000 UTC m=+762.088205548" Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.231946 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" podStartSLOduration=3.23193123 podStartE2EDuration="3.23193123s" podCreationTimestamp="2025-10-02 11:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:21.229984459 +0000 UTC m=+762.121138046" watchObservedRunningTime="2025-10-02 11:31:21.23193123 +0000 UTC m=+762.123084787" Oct 02 11:31:21 crc kubenswrapper[4658]: I1002 11:31:21.553585 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75956f6b99-lpkgn" Oct 02 11:31:26 crc kubenswrapper[4658]: I1002 11:31:26.256804 4658 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 11:31:27 crc kubenswrapper[4658]: I1002 11:31:27.429872 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:31:27 crc kubenswrapper[4658]: I1002 11:31:27.429919 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:31:27 crc kubenswrapper[4658]: I1002 11:31:27.429961 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:31:27 crc kubenswrapper[4658]: I1002 11:31:27.430461 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:31:27 crc kubenswrapper[4658]: I1002 11:31:27.430503 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d" gracePeriod=600 Oct 02 11:31:28 crc kubenswrapper[4658]: I1002 11:31:28.216415 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d" exitCode=0 Oct 02 11:31:28 crc kubenswrapper[4658]: I1002 11:31:28.216477 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d"} Oct 02 11:31:28 crc kubenswrapper[4658]: I1002 11:31:28.216777 
4658 scope.go:117] "RemoveContainer" containerID="5bc01ce3e07d1b6a10970b2c99f735c11379957461a2c770db550fe5be4c1278" Oct 02 11:31:28 crc kubenswrapper[4658]: I1002 11:31:28.436345 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hbfjg" Oct 02 11:31:29 crc kubenswrapper[4658]: I1002 11:31:29.226615 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a"} Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.540569 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb"] Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.543409 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.546940 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.548544 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb"] Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.585166 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.585566 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj422\" (UniqueName: \"kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.585604 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.686150 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.686210 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj422\" 
(UniqueName: \"kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.686246 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.687103 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.687185 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.702926 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj422\" (UniqueName: \"kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:41 crc kubenswrapper[4658]: I1002 11:31:41.857623 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:42 crc kubenswrapper[4658]: I1002 11:31:42.318524 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb"] Oct 02 11:31:42 crc kubenswrapper[4658]: W1002 11:31:42.325740 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96b70650_2104_48e7_80fb_a2294a277006.slice/crio-9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee WatchSource:0}: Error finding container 9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee: Status 404 returned error can't find the container with id 9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.323358 4658 generic.go:334] "Generic (PLEG): container finished" podID="96b70650-2104-48e7-80fb-a2294a277006" containerID="ba4d7f7d522f78074a8e88d3e53904df70093a18cc7d170bea04b1f4f1bf85a3" exitCode=0 Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.323424 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" event={"ID":"96b70650-2104-48e7-80fb-a2294a277006","Type":"ContainerDied","Data":"ba4d7f7d522f78074a8e88d3e53904df70093a18cc7d170bea04b1f4f1bf85a3"} Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.323482 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" event={"ID":"96b70650-2104-48e7-80fb-a2294a277006","Type":"ContainerStarted","Data":"9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee"} Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.900305 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.901946 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:43 crc kubenswrapper[4658]: I1002 11:31:43.914001 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.015034 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.015254 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwt4\" (UniqueName: \"kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.015350 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.116123 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwt4\" (UniqueName: \"kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.116180 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.116227 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.116785 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.116815 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.147479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sxwt4\" (UniqueName: \"kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4\") pod \"redhat-operators-qcss4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.234592 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.303003 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-md7fr" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerName="console" containerID="cri-o://9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77" gracePeriod=15 Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.740692 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:44 crc kubenswrapper[4658]: W1002 11:31:44.743006 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ac3642_4bf7_477b_af65_642bf2a7b7b4.slice/crio-0984d6dbc4df11d67694b0c830619343581b52e141a9aabbd9379b107d050669 WatchSource:0}: Error finding container 0984d6dbc4df11d67694b0c830619343581b52e141a9aabbd9379b107d050669: Status 404 returned error can't find the container with id 0984d6dbc4df11d67694b0c830619343581b52e141a9aabbd9379b107d050669 Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.887024 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-md7fr_4082750e-cf12-45b4-8920-63f31ad1cc28/console/0.log" Oct 02 11:31:44 crc kubenswrapper[4658]: I1002 11:31:44.887337 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029399 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029440 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029486 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029510 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029591 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.029615 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-console-config\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.030324 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-console-config" (OuterVolumeSpecName: "console-config") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.030336 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.030403 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m98wx\" (UniqueName: \"kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx\") pod \"4082750e-cf12-45b4-8920-63f31ad1cc28\" (UID: \"4082750e-cf12-45b4-8920-63f31ad1cc28\") " Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.030635 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.030840 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca" (OuterVolumeSpecName: "service-ca") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.031210 4658 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.031227 4658 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.031238 4658 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.036916 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.037382 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.037850 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx" (OuterVolumeSpecName: "kube-api-access-m98wx") pod "4082750e-cf12-45b4-8920-63f31ad1cc28" (UID: "4082750e-cf12-45b4-8920-63f31ad1cc28"). InnerVolumeSpecName "kube-api-access-m98wx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.132317 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m98wx\" (UniqueName: \"kubernetes.io/projected/4082750e-cf12-45b4-8920-63f31ad1cc28-kube-api-access-m98wx\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.132355 4658 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4082750e-cf12-45b4-8920-63f31ad1cc28-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.132368 4658 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.132381 4658 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4082750e-cf12-45b4-8920-63f31ad1cc28-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.336381 4658 generic.go:334] "Generic (PLEG): container finished" podID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerID="ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39" exitCode=0 Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.336424 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerDied","Data":"ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39"} Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.336468 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerStarted","Data":"0984d6dbc4df11d67694b0c830619343581b52e141a9aabbd9379b107d050669"} Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.339993 4658 generic.go:334] "Generic (PLEG): container finished" podID="96b70650-2104-48e7-80fb-a2294a277006" containerID="00292cce2312d38de073c7c39a6f73254a9b96ecb3861f22c697ae484a61111a" exitCode=0 Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.340041 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" event={"ID":"96b70650-2104-48e7-80fb-a2294a277006","Type":"ContainerDied","Data":"00292cce2312d38de073c7c39a6f73254a9b96ecb3861f22c697ae484a61111a"} Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342487 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-md7fr_4082750e-cf12-45b4-8920-63f31ad1cc28/console/0.log" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342537 4658 generic.go:334] "Generic (PLEG): container finished" podID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerID="9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77" exitCode=2 Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342563 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-md7fr" event={"ID":"4082750e-cf12-45b4-8920-63f31ad1cc28","Type":"ContainerDied","Data":"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77"} Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342585 4658 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-md7fr" event={"ID":"4082750e-cf12-45b4-8920-63f31ad1cc28","Type":"ContainerDied","Data":"3021f14394484fd84cd612e756649c0d9db6ae2680dcc8adb20be503382eedbb"} Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342605 4658 scope.go:117] "RemoveContainer" containerID="9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.342676 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-md7fr" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.359836 4658 scope.go:117] "RemoveContainer" containerID="9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77" Oct 02 11:31:45 crc kubenswrapper[4658]: E1002 11:31:45.360323 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77\": container with ID starting with 9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77 not found: ID does not exist" containerID="9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.360365 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77"} err="failed to get container status \"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77\": rpc error: code = NotFound desc = could not find container \"9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77\": container with ID starting with 9f6f46dad292d145dd676e149eb7c76ad748ebb92b99c2b314331bf3c4fc5f77 not found: ID does not exist" Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.398416 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-md7fr"] Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.401469 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-md7fr"] Oct 02 11:31:45 crc kubenswrapper[4658]: I1002 11:31:45.957399 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" path="/var/lib/kubelet/pods/4082750e-cf12-45b4-8920-63f31ad1cc28/volumes" Oct 02 11:31:46 crc kubenswrapper[4658]: I1002 11:31:46.350174 4658 generic.go:334] "Generic (PLEG): container finished" podID="96b70650-2104-48e7-80fb-a2294a277006" containerID="7718be085e0883c34580c4fe0724b31fdf4564f106dd4ba932f281a1dc40b5d2" exitCode=0 Oct 02 11:31:46 crc kubenswrapper[4658]: I1002 11:31:46.350263 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" event={"ID":"96b70650-2104-48e7-80fb-a2294a277006","Type":"ContainerDied","Data":"7718be085e0883c34580c4fe0724b31fdf4564f106dd4ba932f281a1dc40b5d2"} Oct 02 11:31:46 crc kubenswrapper[4658]: I1002 11:31:46.354137 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerStarted","Data":"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9"} Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.361587 4658 generic.go:334] "Generic (PLEG): container finished" podID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" 
containerID="9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9" exitCode=0 Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.361685 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerDied","Data":"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9"} Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.772565 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.881407 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util\") pod \"96b70650-2104-48e7-80fb-a2294a277006\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.881494 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj422\" (UniqueName: \"kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422\") pod \"96b70650-2104-48e7-80fb-a2294a277006\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.881629 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle\") pod \"96b70650-2104-48e7-80fb-a2294a277006\" (UID: \"96b70650-2104-48e7-80fb-a2294a277006\") " Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.882645 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle" (OuterVolumeSpecName: "bundle") pod "96b70650-2104-48e7-80fb-a2294a277006" (UID: "96b70650-2104-48e7-80fb-a2294a277006"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.888136 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422" (OuterVolumeSpecName: "kube-api-access-vj422") pod "96b70650-2104-48e7-80fb-a2294a277006" (UID: "96b70650-2104-48e7-80fb-a2294a277006"). InnerVolumeSpecName "kube-api-access-vj422". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.983095 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj422\" (UniqueName: \"kubernetes.io/projected/96b70650-2104-48e7-80fb-a2294a277006-kube-api-access-vj422\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:47 crc kubenswrapper[4658]: I1002 11:31:47.983136 4658 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.168412 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util" (OuterVolumeSpecName: "util") pod "96b70650-2104-48e7-80fb-a2294a277006" (UID: "96b70650-2104-48e7-80fb-a2294a277006"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.184915 4658 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96b70650-2104-48e7-80fb-a2294a277006-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.370419 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" event={"ID":"96b70650-2104-48e7-80fb-a2294a277006","Type":"ContainerDied","Data":"9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee"} Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.370476 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f02a20fd06bd9f283071f415ae12b87cf107f73dc542932a4984de74adc87ee" Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.370446 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb" Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.374272 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerStarted","Data":"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c"} Oct 02 11:31:48 crc kubenswrapper[4658]: I1002 11:31:48.400327 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qcss4" podStartSLOduration=2.527962996 podStartE2EDuration="5.400311454s" podCreationTimestamp="2025-10-02 11:31:43 +0000 UTC" firstStartedPulling="2025-10-02 11:31:45.337982356 +0000 UTC m=+786.229135933" lastFinishedPulling="2025-10-02 11:31:48.210330804 +0000 UTC m=+789.101484391" observedRunningTime="2025-10-02 11:31:48.396229823 +0000 UTC m=+789.287383410" watchObservedRunningTime="2025-10-02 11:31:48.400311454 +0000 UTC m=+789.291465021" Oct 02 11:31:54 crc kubenswrapper[4658]: I1002 11:31:54.234811 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:54 crc kubenswrapper[4658]: I1002 11:31:54.235247 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:54 crc kubenswrapper[4658]: I1002 11:31:54.271858 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:54 crc kubenswrapper[4658]: I1002 11:31:54.442675 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:55 crc kubenswrapper[4658]: I1002 11:31:55.694472 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:56 crc kubenswrapper[4658]: I1002 11:31:56.416432 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qcss4" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="registry-server" containerID="cri-o://9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c" gracePeriod=2 Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.204495 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq"] Oct 02 
11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.205377 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="pull" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205395 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="pull" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.205410 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerName="console" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205417 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerName="console" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.205430 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="extract" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205439 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="extract" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.205455 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="util" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205463 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="util" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205586 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b70650-2104-48e7-80fb-a2294a277006" containerName="extract" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.205605 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4082750e-cf12-45b4-8920-63f31ad1cc28" containerName="console" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.206167 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.210784 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.211011 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.211260 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.211404 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.211534 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5nlfp" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.228358 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq"] Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.330812 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.330893 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-webhook-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.331023 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfx5h\" (UniqueName: \"kubernetes.io/projected/f9c60d31-755b-4e0e-888c-072203581d0d-kube-api-access-pfx5h\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.422803 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.424156 4658 generic.go:334] "Generic (PLEG): container finished" podID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerID="9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c" exitCode=0 Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.424206 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerDied","Data":"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c"} Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.424237 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcss4" event={"ID":"58ac3642-4bf7-477b-af65-642bf2a7b7b4","Type":"ContainerDied","Data":"0984d6dbc4df11d67694b0c830619343581b52e141a9aabbd9379b107d050669"} Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.424256 4658 scope.go:117] "RemoveContainer" containerID="9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.432436 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfx5h\" (UniqueName: \"kubernetes.io/projected/f9c60d31-755b-4e0e-888c-072203581d0d-kube-api-access-pfx5h\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.432525 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.432544 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-webhook-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.439524 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-webhook-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.443390 4658 scope.go:117] "RemoveContainer" containerID="9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.455178 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfx5h\" (UniqueName: \"kubernetes.io/projected/f9c60d31-755b-4e0e-888c-072203581d0d-kube-api-access-pfx5h\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc 
kubenswrapper[4658]: I1002 11:31:57.455201 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9c60d31-755b-4e0e-888c-072203581d0d-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6495c478-cxldq\" (UID: \"f9c60d31-755b-4e0e-888c-072203581d0d\") " pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.494035 4658 scope.go:117] "RemoveContainer" containerID="ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.512089 4658 scope.go:117] "RemoveContainer" containerID="9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.512783 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c\": container with ID starting with 9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c not found: ID does not exist" containerID="9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.512821 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c"} err="failed to get container status \"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c\": rpc error: code = NotFound desc = could not find container \"9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c\": container with ID starting with 9de5e940bc4d14d3697a614da8c9899e9a2e2917c2c1f8545ad793303d64b83c not found: ID does not exist" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.512843 4658 scope.go:117] "RemoveContainer" containerID="9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.513260 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9\": container with ID starting with 9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9 not found: ID does not exist" containerID="9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.513312 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9"} err="failed to get container status \"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9\": rpc error: code = NotFound desc = could not find container \"9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9\": container with ID starting with 9dc04805062a7671ae8f55e2ca6316ec0cf4cadb185045506f13c38029d89bd9 not found: ID does not exist" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.513338 4658 scope.go:117] "RemoveContainer" containerID="ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.513760 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39\": container with ID starting with 
ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39 not found: ID does not exist" containerID="ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.513793 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39"} err="failed to get container status \"ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39\": rpc error: code = NotFound desc = could not find container \"ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39\": container with ID starting with ac63318aeb9e98246f85b61b340381c4661f40ac35fb382bcead52481db07c39 not found: ID does not exist" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.529191 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.534851 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities\") pod \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.534906 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwt4\" (UniqueName: \"kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4\") pod \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.534960 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content\") pod \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\" (UID: \"58ac3642-4bf7-477b-af65-642bf2a7b7b4\") " Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.535862 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities" (OuterVolumeSpecName: "utilities") pod "58ac3642-4bf7-477b-af65-642bf2a7b7b4" (UID: "58ac3642-4bf7-477b-af65-642bf2a7b7b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.539349 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4" (OuterVolumeSpecName: "kube-api-access-sxwt4") pod "58ac3642-4bf7-477b-af65-642bf2a7b7b4" (UID: "58ac3642-4bf7-477b-af65-642bf2a7b7b4"). InnerVolumeSpecName "kube-api-access-sxwt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.590528 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z"] Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.591033 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="registry-server" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.591128 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="registry-server" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.591229 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="extract-utilities" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.591342 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="extract-utilities" Oct 02 11:31:57 crc kubenswrapper[4658]: E1002 11:31:57.591427 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="extract-content" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.591487 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="extract-content" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.591659 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" containerName="registry-server" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.592211 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.598334 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.598550 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5j4mh" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.598680 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.634225 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58ac3642-4bf7-477b-af65-642bf2a7b7b4" (UID: "58ac3642-4bf7-477b-af65-642bf2a7b7b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.635965 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-apiservice-cert\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.636016 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgrb\" (UniqueName: \"kubernetes.io/projected/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-kube-api-access-8kgrb\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.636062 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-webhook-cert\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.636140 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.636159 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwt4\" (UniqueName: \"kubernetes.io/projected/58ac3642-4bf7-477b-af65-642bf2a7b7b4-kube-api-access-sxwt4\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.636171 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ac3642-4bf7-477b-af65-642bf2a7b7b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.643842 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z"] Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.738213 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-apiservice-cert\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.738550 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgrb\" (UniqueName: \"kubernetes.io/projected/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-kube-api-access-8kgrb\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.738581 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-webhook-cert\") pod 
\"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.744482 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-apiservice-cert\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.762872 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-webhook-cert\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.788186 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgrb\" (UniqueName: \"kubernetes.io/projected/00a0ddd2-7f0b-4158-a95a-dd16a826ea1e-kube-api-access-8kgrb\") pod \"metallb-operator-webhook-server-7db6cbc8bb-b4n8z\" (UID: \"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e\") " pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.916751 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:31:57 crc kubenswrapper[4658]: I1002 11:31:57.927115 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq"] Oct 02 11:31:57 crc kubenswrapper[4658]: W1002 11:31:57.954226 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c60d31_755b_4e0e_888c_072203581d0d.slice/crio-a509721dfea245f2a0cf8b4826488187e8804df1fee2a4ed6761c0f2a84c0527 WatchSource:0}: Error finding container a509721dfea245f2a0cf8b4826488187e8804df1fee2a4ed6761c0f2a84c0527: Status 404 returned error can't find the container with id a509721dfea245f2a0cf8b4826488187e8804df1fee2a4ed6761c0f2a84c0527 Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.380472 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z"] Oct 02 11:31:58 crc kubenswrapper[4658]: W1002 11:31:58.389262 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a0ddd2_7f0b_4158_a95a_dd16a826ea1e.slice/crio-741ca965b695f2ff13679b667a847e9fc5ca2b75686aa93eae7048f075040dd2 WatchSource:0}: Error finding container 741ca965b695f2ff13679b667a847e9fc5ca2b75686aa93eae7048f075040dd2: Status 404 returned error can't find the container with id 741ca965b695f2ff13679b667a847e9fc5ca2b75686aa93eae7048f075040dd2 Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.430738 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" event={"ID":"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e","Type":"ContainerStarted","Data":"741ca965b695f2ff13679b667a847e9fc5ca2b75686aa93eae7048f075040dd2"} Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.431846 4658 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcss4" Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.433146 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" event={"ID":"f9c60d31-755b-4e0e-888c-072203581d0d","Type":"ContainerStarted","Data":"a509721dfea245f2a0cf8b4826488187e8804df1fee2a4ed6761c0f2a84c0527"} Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.445207 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:58 crc kubenswrapper[4658]: I1002 11:31:58.452957 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qcss4"] Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.302881 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.304283 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.317333 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.361599 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.361654 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzkh\" (UniqueName: \"kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.361683 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.463204 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzkh\" (UniqueName: \"kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.463313 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.463393 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.463800 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.463863 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.488363 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzkh\" (UniqueName: \"kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh\") pod \"certified-operators-gsbpb\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.665753 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:31:59 crc kubenswrapper[4658]: I1002 11:31:59.961127 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ac3642-4bf7-477b-af65-642bf2a7b7b4" path="/var/lib/kubelet/pods/58ac3642-4bf7-477b-af65-642bf2a7b7b4/volumes" Oct 02 11:32:00 crc kubenswrapper[4658]: I1002 11:32:00.220134 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:32:00 crc kubenswrapper[4658]: I1002 11:32:00.445378 4658 generic.go:334] "Generic (PLEG): container finished" podID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerID="f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40" exitCode=0 Oct 02 11:32:00 crc kubenswrapper[4658]: I1002 11:32:00.445427 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerDied","Data":"f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40"} Oct 02 11:32:00 crc kubenswrapper[4658]: I1002 11:32:00.445461 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerStarted","Data":"0ed2b462f59d53bdbaed26d828bdd2c9201a2c3e0de8ebcfd063d4265b78ae54"} Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.471442 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" event={"ID":"f9c60d31-755b-4e0e-888c-072203581d0d","Type":"ContainerStarted","Data":"c625f862de7c206784f8b2495d5ee6274d1d839f3ec91b395b14d7a923f51b8b"} Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.472078 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.473262 4658 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" event={"ID":"00a0ddd2-7f0b-4158-a95a-dd16a826ea1e","Type":"ContainerStarted","Data":"a2abfbac0b1e86519c332aba1d3fce20e127b2130031f1f1b74a71cc53d9b931"} Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.473399 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.475026 4658 generic.go:334] "Generic (PLEG): container finished" podID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerID="82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f" exitCode=0 Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.475080 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerDied","Data":"82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f"} Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.493556 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" podStartSLOduration=1.842840324 podStartE2EDuration="7.493534696s" podCreationTimestamp="2025-10-02 11:31:57 +0000 UTC" firstStartedPulling="2025-10-02 11:31:57.955901372 +0000 UTC m=+798.847054939" lastFinishedPulling="2025-10-02 11:32:03.606595744 +0000 UTC m=+804.497749311" observedRunningTime="2025-10-02 11:32:04.488418042 +0000 UTC m=+805.379571629" watchObservedRunningTime="2025-10-02 11:32:04.493534696 +0000 UTC m=+805.384688263" Oct 02 11:32:04 crc kubenswrapper[4658]: I1002 11:32:04.512418 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" podStartSLOduration=2.294782171 podStartE2EDuration="7.51239614s" podCreationTimestamp="2025-10-02 11:31:57 +0000 UTC" firstStartedPulling="2025-10-02 11:31:58.391922989 +0000 UTC m=+799.283076546" lastFinishedPulling="2025-10-02 11:32:03.609536948 +0000 UTC m=+804.500690515" observedRunningTime="2025-10-02 11:32:04.5073843 +0000 UTC m=+805.398537877" watchObservedRunningTime="2025-10-02 11:32:04.51239614 +0000 UTC m=+805.403549707" Oct 02 11:32:05 crc kubenswrapper[4658]: I1002 11:32:05.482567 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerStarted","Data":"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef"} Oct 02 11:32:05 crc kubenswrapper[4658]: I1002 11:32:05.503820 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gsbpb" podStartSLOduration=1.995121878 podStartE2EDuration="6.503795401s" podCreationTimestamp="2025-10-02 11:31:59 +0000 UTC" firstStartedPulling="2025-10-02 11:32:00.469057075 +0000 UTC m=+801.360210642" lastFinishedPulling="2025-10-02 11:32:04.977730587 +0000 UTC m=+805.868884165" observedRunningTime="2025-10-02 11:32:05.501798237 +0000 UTC m=+806.392951804" watchObservedRunningTime="2025-10-02 11:32:05.503795401 +0000 UTC m=+806.394948968" Oct 02 11:32:09 crc kubenswrapper[4658]: I1002 11:32:09.666494 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:09 crc kubenswrapper[4658]: I1002 11:32:09.667997 4658 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:09 crc kubenswrapper[4658]: I1002 11:32:09.762983 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:10 crc kubenswrapper[4658]: I1002 11:32:10.563134 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:12 crc kubenswrapper[4658]: I1002 11:32:12.094275 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.532472 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gsbpb" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="registry-server" containerID="cri-o://58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef" gracePeriod=2 Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.903557 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.976961 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content\") pod \"387eedb6-4716-419d-98b0-87f9a9129cfe\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.977042 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hzkh\" (UniqueName: \"kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh\") pod \"387eedb6-4716-419d-98b0-87f9a9129cfe\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.977066 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities\") pod \"387eedb6-4716-419d-98b0-87f9a9129cfe\" (UID: \"387eedb6-4716-419d-98b0-87f9a9129cfe\") " Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.978002 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities" (OuterVolumeSpecName: "utilities") pod "387eedb6-4716-419d-98b0-87f9a9129cfe" (UID: "387eedb6-4716-419d-98b0-87f9a9129cfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:13 crc kubenswrapper[4658]: I1002 11:32:13.982710 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh" (OuterVolumeSpecName: "kube-api-access-4hzkh") pod "387eedb6-4716-419d-98b0-87f9a9129cfe" (UID: "387eedb6-4716-419d-98b0-87f9a9129cfe"). InnerVolumeSpecName "kube-api-access-4hzkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.044760 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "387eedb6-4716-419d-98b0-87f9a9129cfe" (UID: "387eedb6-4716-419d-98b0-87f9a9129cfe"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.079018 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.079056 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hzkh\" (UniqueName: \"kubernetes.io/projected/387eedb6-4716-419d-98b0-87f9a9129cfe-kube-api-access-4hzkh\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.079068 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/387eedb6-4716-419d-98b0-87f9a9129cfe-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.544324 4658 generic.go:334] "Generic (PLEG): container finished" podID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerID="58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef" exitCode=0 Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.544373 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerDied","Data":"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef"} Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.544382 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsbpb" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.544409 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsbpb" event={"ID":"387eedb6-4716-419d-98b0-87f9a9129cfe","Type":"ContainerDied","Data":"0ed2b462f59d53bdbaed26d828bdd2c9201a2c3e0de8ebcfd063d4265b78ae54"} Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.544430 4658 scope.go:117] "RemoveContainer" containerID="58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.564797 4658 scope.go:117] "RemoveContainer" containerID="82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.582640 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.589544 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gsbpb"] Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.608949 4658 scope.go:117] "RemoveContainer" containerID="f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.629577 4658 scope.go:117] "RemoveContainer" containerID="58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef" Oct 02 11:32:14 crc kubenswrapper[4658]: E1002 11:32:14.631046 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef\": container with ID starting with 58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef not found: ID does not exist" containerID="58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef" Oct 02 11:32:14 crc 
kubenswrapper[4658]: I1002 11:32:14.631087 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef"} err="failed to get container status \"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef\": rpc error: code = NotFound desc = could not find container \"58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef\": container with ID starting with 58a9d34c5504822ff85feb0979ba37dac669e8ef4c115361238c629ce78a52ef not found: ID does not exist" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.631111 4658 scope.go:117] "RemoveContainer" containerID="82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f" Oct 02 11:32:14 crc kubenswrapper[4658]: E1002 11:32:14.632355 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f\": container with ID starting with 82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f not found: ID does not exist" containerID="82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.632385 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f"} err="failed to get container status \"82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f\": rpc error: code = NotFound desc = could not find container \"82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f\": container with ID starting with 82e4cd5b8702eced1fcf89005cc08e62e95622d3fbd612241312bc8164f9ee1f not found: ID does not exist" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.632405 4658 scope.go:117] "RemoveContainer" containerID="f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40" Oct 02 11:32:14 crc kubenswrapper[4658]: E1002 11:32:14.632790 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40\": container with ID starting with f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40 not found: ID does not exist" containerID="f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40" Oct 02 11:32:14 crc kubenswrapper[4658]: I1002 11:32:14.632818 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40"} err="failed to get container status \"f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40\": rpc error: code = NotFound desc = could not find container \"f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40\": container with ID starting with f7e80ecc2928ff64750b06383c77fdb61de0bd2e70816ce7ac42f2b5a37a0e40 not found: ID does not exist" Oct 02 11:32:15 crc kubenswrapper[4658]: I1002 11:32:15.957401 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" path="/var/lib/kubelet/pods/387eedb6-4716-419d-98b0-87f9a9129cfe/volumes" Oct 02 11:32:17 crc kubenswrapper[4658]: I1002 11:32:17.922367 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7db6cbc8bb-b4n8z" Oct 02 11:32:37 crc kubenswrapper[4658]: I1002 
11:32:37.532239 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c6495c478-cxldq" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.406205 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jzr5d"] Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.406763 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="registry-server" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.406859 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="registry-server" Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.406946 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="extract-content" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.407026 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="extract-content" Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.407123 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="extract-utilities" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.407195 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="extract-utilities" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.407432 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="387eedb6-4716-419d-98b0-87f9a9129cfe" containerName="registry-server" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.410072 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.412863 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.416483 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd"] Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.413040 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qshtq" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.415648 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.431471 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd"] Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.431619 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.434058 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.495522 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mrv9d"] Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.495976 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-sockets\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496030 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66mxz\" (UniqueName: \"kubernetes.io/projected/0d17ce7e-0727-401c-b54e-8b6e6729d22a-kube-api-access-66mxz\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496078 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-conf\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496124 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496144 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496178 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics-certs\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496197 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-startup\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496223 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7vq\" (UniqueName: \"kubernetes.io/projected/74c490f6-26be-4b3c-93f7-65b1625425a1-kube-api-access-rm7vq\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 
crc kubenswrapper[4658]: I1002 11:32:38.496253 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-reloader\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.496447 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.498088 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.498205 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hzffw" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.498502 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.500201 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.524657 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-bsjg9"] Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.525835 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.528043 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.536013 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bsjg9"] Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597161 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597222 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metrics-certs\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597249 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597276 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-cert\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597365 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics-certs\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597388 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-metrics-certs\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.597391 4658 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597442 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-startup\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.597511 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert podName:74c490f6-26be-4b3c-93f7-65b1625425a1 nodeName:}" failed. No retries permitted until 2025-10-02 11:32:39.097494468 +0000 UTC m=+839.988648035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert") pod "frr-k8s-webhook-server-64bf5d555-k4bcd" (UID: "74c490f6-26be-4b3c-93f7-65b1625425a1") : secret "frr-k8s-webhook-server-cert" not found Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597527 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7vq\" (UniqueName: \"kubernetes.io/projected/74c490f6-26be-4b3c-93f7-65b1625425a1-kube-api-access-rm7vq\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597570 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-reloader\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597655 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metallb-excludel2\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597677 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-sockets\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597730 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597753 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597794 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snqbz\" (UniqueName: \"kubernetes.io/projected/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-kube-api-access-snqbz\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597819 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcc2v\" (UniqueName: \"kubernetes.io/projected/f053e253-c411-41a9-b81b-d7cf91cc9b8b-kube-api-access-pcc2v\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597880 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66mxz\" (UniqueName: \"kubernetes.io/projected/0d17ce7e-0727-401c-b54e-8b6e6729d22a-kube-api-access-66mxz\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597910 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-conf\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.597957 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-sockets\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.598039 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-reloader\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.598170 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-conf\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.598289 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d17ce7e-0727-401c-b54e-8b6e6729d22a-frr-startup\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 
11:32:38.603745 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d17ce7e-0727-401c-b54e-8b6e6729d22a-metrics-certs\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.612309 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7vq\" (UniqueName: \"kubernetes.io/projected/74c490f6-26be-4b3c-93f7-65b1625425a1-kube-api-access-rm7vq\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.614655 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66mxz\" (UniqueName: \"kubernetes.io/projected/0d17ce7e-0727-401c-b54e-8b6e6729d22a-kube-api-access-66mxz\") pod \"frr-k8s-jzr5d\" (UID: \"0d17ce7e-0727-401c-b54e-8b6e6729d22a\") " pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698547 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metallb-excludel2\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698602 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698621 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snqbz\" (UniqueName: \"kubernetes.io/projected/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-kube-api-access-snqbz\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698639 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcc2v\" (UniqueName: \"kubernetes.io/projected/f053e253-c411-41a9-b81b-d7cf91cc9b8b-kube-api-access-pcc2v\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698670 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metrics-certs\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698698 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-cert\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.698720 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-metrics-certs\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.699499 4658 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:32:38 crc kubenswrapper[4658]: E1002 11:32:38.699610 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist podName:f053e253-c411-41a9-b81b-d7cf91cc9b8b nodeName:}" failed. No retries permitted until 2025-10-02 11:32:39.199585831 +0000 UTC m=+840.090739468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist") pod "speaker-mrv9d" (UID: "f053e253-c411-41a9-b81b-d7cf91cc9b8b") : secret "metallb-memberlist" not found Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.699991 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metallb-excludel2\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.702286 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-metrics-certs\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.702745 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-metrics-certs\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.709763 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-cert\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.717959 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snqbz\" (UniqueName: \"kubernetes.io/projected/9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41-kube-api-access-snqbz\") pod \"controller-68d546b9d8-bsjg9\" (UID: \"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41\") " pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.719726 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcc2v\" (UniqueName: \"kubernetes.io/projected/f053e253-c411-41a9-b81b-d7cf91cc9b8b-kube-api-access-pcc2v\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.730134 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:38 crc kubenswrapper[4658]: I1002 11:32:38.839245 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.105146 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.111039 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c490f6-26be-4b3c-93f7-65b1625425a1-cert\") pod \"frr-k8s-webhook-server-64bf5d555-k4bcd\" (UID: \"74c490f6-26be-4b3c-93f7-65b1625425a1\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.206879 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:39 crc kubenswrapper[4658]: E1002 11:32:39.207066 4658 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:32:39 crc kubenswrapper[4658]: E1002 11:32:39.207152 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist podName:f053e253-c411-41a9-b81b-d7cf91cc9b8b nodeName:}" failed. No retries permitted until 2025-10-02 11:32:40.20713524 +0000 UTC m=+841.098288807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist") pod "speaker-mrv9d" (UID: "f053e253-c411-41a9-b81b-d7cf91cc9b8b") : secret "metallb-memberlist" not found Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.320242 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bsjg9"] Oct 02 11:32:39 crc kubenswrapper[4658]: W1002 11:32:39.325720 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9686fc5d_61b7_47a1_b0b0_0bcdd8b31d41.slice/crio-0663efdda2f448e75348312460c1cfcd03457945fea5dac84f3190d7c3f40cd3 WatchSource:0}: Error finding container 0663efdda2f448e75348312460c1cfcd03457945fea5dac84f3190d7c3f40cd3: Status 404 returned error can't find the container with id 0663efdda2f448e75348312460c1cfcd03457945fea5dac84f3190d7c3f40cd3 Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.349311 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.689891 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bsjg9" event={"ID":"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41","Type":"ContainerStarted","Data":"a3c3205f5e8defd3a8ee4ce8e3a1e1156059168bdd5a6b5affd525b80004e348"} Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.690188 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bsjg9" event={"ID":"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41","Type":"ContainerStarted","Data":"0663efdda2f448e75348312460c1cfcd03457945fea5dac84f3190d7c3f40cd3"} Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.691007 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"13bd005f2caa899e1f850319b23e70016a7d8b9d1fb74f7903af9b363e95eaac"} Oct 02 11:32:39 crc kubenswrapper[4658]: I1002 11:32:39.748164 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd"] Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.235606 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.261028 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f053e253-c411-41a9-b81b-d7cf91cc9b8b-memberlist\") pod \"speaker-mrv9d\" (UID: \"f053e253-c411-41a9-b81b-d7cf91cc9b8b\") " pod="metallb-system/speaker-mrv9d" Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.327790 4658 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hzffw" Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.330595 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mrv9d" Oct 02 11:32:40 crc kubenswrapper[4658]: W1002 11:32:40.370515 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf053e253_c411_41a9_b81b_d7cf91cc9b8b.slice/crio-25ca496be7029180ef068d03581e77d59a14b1bf3ed7143143a077fb3d183148 WatchSource:0}: Error finding container 25ca496be7029180ef068d03581e77d59a14b1bf3ed7143143a077fb3d183148: Status 404 returned error can't find the container with id 25ca496be7029180ef068d03581e77d59a14b1bf3ed7143143a077fb3d183148 Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.703872 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" event={"ID":"74c490f6-26be-4b3c-93f7-65b1625425a1","Type":"ContainerStarted","Data":"ca490cbb75a5835ad6174140f31f719b8dbacae44b6d6600142b07132b7d32cb"} Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.707139 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bsjg9" event={"ID":"9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41","Type":"ContainerStarted","Data":"dbe4b4c0ba79422a16f8632817e9d3fcae83c02a9f19e262b02c2a3fd54af265"} Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.708056 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.710544 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mrv9d" event={"ID":"f053e253-c411-41a9-b81b-d7cf91cc9b8b","Type":"ContainerStarted","Data":"25ca496be7029180ef068d03581e77d59a14b1bf3ed7143143a077fb3d183148"} Oct 02 11:32:40 crc kubenswrapper[4658]: I1002 11:32:40.733447 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-bsjg9" podStartSLOduration=2.733427738 podStartE2EDuration="2.733427738s" podCreationTimestamp="2025-10-02 11:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:40.728823911 +0000 UTC m=+841.619977478" watchObservedRunningTime="2025-10-02 11:32:40.733427738 +0000 UTC m=+841.624581305" Oct 02 11:32:41 crc kubenswrapper[4658]: I1002 11:32:41.727908 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mrv9d" event={"ID":"f053e253-c411-41a9-b81b-d7cf91cc9b8b","Type":"ContainerStarted","Data":"6ebd5484268db6ddbbd481d683af39892c8b9b4f445217927337e0e6b55ea5a4"} Oct 02 11:32:41 crc kubenswrapper[4658]: I1002 11:32:41.727964 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mrv9d" event={"ID":"f053e253-c411-41a9-b81b-d7cf91cc9b8b","Type":"ContainerStarted","Data":"7eaa10dbce92672127d641522092ab8c30b7effbbd31481457125e89704e9cbb"} Oct 02 11:32:41 crc kubenswrapper[4658]: I1002 11:32:41.727999 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mrv9d" Oct 02 11:32:41 crc kubenswrapper[4658]: I1002 11:32:41.754490 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mrv9d" podStartSLOduration=3.754475229 podStartE2EDuration="3.754475229s" podCreationTimestamp="2025-10-02 11:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:41.751482964 +0000 UTC 
m=+842.642636531" watchObservedRunningTime="2025-10-02 11:32:41.754475229 +0000 UTC m=+842.645628796" Oct 02 11:32:46 crc kubenswrapper[4658]: I1002 11:32:46.785285 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" event={"ID":"74c490f6-26be-4b3c-93f7-65b1625425a1","Type":"ContainerStarted","Data":"d1fb1c2c4f4540e3391ce819b56bdab0defa74e3312fbe3fd351d72ee619989e"} Oct 02 11:32:46 crc kubenswrapper[4658]: I1002 11:32:46.786233 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:46 crc kubenswrapper[4658]: I1002 11:32:46.789256 4658 generic.go:334] "Generic (PLEG): container finished" podID="0d17ce7e-0727-401c-b54e-8b6e6729d22a" containerID="a742cc3f42468d807f60623f7abf8bf44bbb95b50444a26cd130fe888a17c27a" exitCode=0 Oct 02 11:32:46 crc kubenswrapper[4658]: I1002 11:32:46.789340 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerDied","Data":"a742cc3f42468d807f60623f7abf8bf44bbb95b50444a26cd130fe888a17c27a"} Oct 02 11:32:46 crc kubenswrapper[4658]: I1002 11:32:46.811012 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" podStartSLOduration=2.004466659 podStartE2EDuration="8.810993943s" podCreationTimestamp="2025-10-02 11:32:38 +0000 UTC" firstStartedPulling="2025-10-02 11:32:39.759497587 +0000 UTC m=+840.650651154" lastFinishedPulling="2025-10-02 11:32:46.566024871 +0000 UTC m=+847.457178438" observedRunningTime="2025-10-02 11:32:46.810962002 +0000 UTC m=+847.702115579" watchObservedRunningTime="2025-10-02 11:32:46.810993943 +0000 UTC m=+847.702147510" Oct 02 11:32:47 crc kubenswrapper[4658]: I1002 11:32:47.799487 4658 generic.go:334] "Generic (PLEG): container finished" podID="0d17ce7e-0727-401c-b54e-8b6e6729d22a" containerID="e8f8544bae16dad943a1c814d0e3f449dd45b894bed413d9781406115212a65b" exitCode=0 Oct 02 11:32:47 crc kubenswrapper[4658]: I1002 11:32:47.799612 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerDied","Data":"e8f8544bae16dad943a1c814d0e3f449dd45b894bed413d9781406115212a65b"} Oct 02 11:32:48 crc kubenswrapper[4658]: I1002 11:32:48.808536 4658 generic.go:334] "Generic (PLEG): container finished" podID="0d17ce7e-0727-401c-b54e-8b6e6729d22a" containerID="457a9ebb6e5e4d3c1d17343edaa9a9280e0c28ac2be87c29b1ec6db38f5e7e1e" exitCode=0 Oct 02 11:32:48 crc kubenswrapper[4658]: I1002 11:32:48.808614 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerDied","Data":"457a9ebb6e5e4d3c1d17343edaa9a9280e0c28ac2be87c29b1ec6db38f5e7e1e"} Oct 02 11:32:49 crc kubenswrapper[4658]: I1002 11:32:49.821361 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"4cad5a52aa1ce47c56fe6df5df8cf50387d6f14df33a635519b81aa124f48435"} Oct 02 11:32:49 crc kubenswrapper[4658]: I1002 11:32:49.821674 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"68c1c0399602a11622ea10f5506c7f82eb64bd465c7e59b05dc5d75e35ea3082"} Oct 02 
11:32:49 crc kubenswrapper[4658]: I1002 11:32:49.821688 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"93a1f0da084109abcadb2a3a6bb1f3ce7271de32b86462df3e8e67c4456a1613"} Oct 02 11:32:49 crc kubenswrapper[4658]: I1002 11:32:49.821701 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"19e32ed45f0957ec8a4afccb8015bed002dfe3a3414441ac0f48b26f561a68de"} Oct 02 11:32:49 crc kubenswrapper[4658]: I1002 11:32:49.821713 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"5aa5f4d37f8b5d06307437087ec68cc478e28bf2b2310de34e9dd38fdca4208b"} Oct 02 11:32:50 crc kubenswrapper[4658]: I1002 11:32:50.337583 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mrv9d" Oct 02 11:32:50 crc kubenswrapper[4658]: I1002 11:32:50.830123 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jzr5d" event={"ID":"0d17ce7e-0727-401c-b54e-8b6e6729d22a","Type":"ContainerStarted","Data":"d6ec88c6cc0cb34a0e53f7a8b28fc27865e88c6139ac1f05276a063306911d98"} Oct 02 11:32:50 crc kubenswrapper[4658]: I1002 11:32:50.830793 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:50 crc kubenswrapper[4658]: I1002 11:32:50.853030 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jzr5d" podStartSLOduration=5.132688129 podStartE2EDuration="12.853013126s" podCreationTimestamp="2025-10-02 11:32:38 +0000 UTC" firstStartedPulling="2025-10-02 11:32:38.863956179 +0000 UTC m=+839.755109746" lastFinishedPulling="2025-10-02 11:32:46.584281176 +0000 UTC m=+847.475434743" observedRunningTime="2025-10-02 11:32:50.848935276 +0000 UTC m=+851.740088833" watchObservedRunningTime="2025-10-02 11:32:50.853013126 +0000 UTC m=+851.744166693" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.426851 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.428219 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.431438 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvb6\" (UniqueName: \"kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6\") pod \"openstack-operator-index-28gz8\" (UID: \"5f58d62d-8e0e-4c04-a027-8acc49d8edf4\") " pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.432439 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.432509 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gk4q9" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.432987 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.440373 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.532452 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvb6\" (UniqueName: \"kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6\") pod \"openstack-operator-index-28gz8\" (UID: \"5f58d62d-8e0e-4c04-a027-8acc49d8edf4\") " pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.550133 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvb6\" (UniqueName: \"kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6\") pod \"openstack-operator-index-28gz8\" (UID: \"5f58d62d-8e0e-4c04-a027-8acc49d8edf4\") " pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.730889 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.749217 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:32:53 crc kubenswrapper[4658]: I1002 11:32:53.783024 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:54 crc kubenswrapper[4658]: I1002 11:32:54.156654 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:32:54 crc kubenswrapper[4658]: W1002 11:32:54.171503 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f58d62d_8e0e_4c04_a027_8acc49d8edf4.slice/crio-ebb15d35f9078818690bc30f70b56da6118c006e14a56a87f22dced0291cc9b9 WatchSource:0}: Error finding container ebb15d35f9078818690bc30f70b56da6118c006e14a56a87f22dced0291cc9b9: Status 404 returned error can't find the container with id ebb15d35f9078818690bc30f70b56da6118c006e14a56a87f22dced0291cc9b9 Oct 02 11:32:54 crc kubenswrapper[4658]: I1002 11:32:54.855466 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-28gz8" event={"ID":"5f58d62d-8e0e-4c04-a027-8acc49d8edf4","Type":"ContainerStarted","Data":"ebb15d35f9078818690bc30f70b56da6118c006e14a56a87f22dced0291cc9b9"} Oct 02 11:32:57 crc kubenswrapper[4658]: I1002 11:32:57.399424 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.008238 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4jglm"] Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.009010 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.018076 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4jglm"] Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.101688 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdx4s\" (UniqueName: \"kubernetes.io/projected/71959757-609a-415a-9717-711c3f8ad66d-kube-api-access-rdx4s\") pod \"openstack-operator-index-4jglm\" (UID: \"71959757-609a-415a-9717-711c3f8ad66d\") " pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.205575 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdx4s\" (UniqueName: \"kubernetes.io/projected/71959757-609a-415a-9717-711c3f8ad66d-kube-api-access-rdx4s\") pod \"openstack-operator-index-4jglm\" (UID: \"71959757-609a-415a-9717-711c3f8ad66d\") " pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.224121 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdx4s\" (UniqueName: \"kubernetes.io/projected/71959757-609a-415a-9717-711c3f8ad66d-kube-api-access-rdx4s\") pod \"openstack-operator-index-4jglm\" (UID: \"71959757-609a-415a-9717-711c3f8ad66d\") " pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.339652 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.733153 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jzr5d" Oct 02 11:32:58 crc kubenswrapper[4658]: I1002 11:32:58.843086 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-bsjg9" Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.362351 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-k4bcd" Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.818834 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4jglm"] Oct 02 11:32:59 crc kubenswrapper[4658]: W1002 11:32:59.824522 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71959757_609a_415a_9717_711c3f8ad66d.slice/crio-2d55094099bf34c81c5d363cd6fbde46a94dabb43f69d71c21bbad90fa330f45 WatchSource:0}: Error finding container 2d55094099bf34c81c5d363cd6fbde46a94dabb43f69d71c21bbad90fa330f45: Status 404 returned error can't find the container with id 2d55094099bf34c81c5d363cd6fbde46a94dabb43f69d71c21bbad90fa330f45 Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.892681 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-28gz8" event={"ID":"5f58d62d-8e0e-4c04-a027-8acc49d8edf4","Type":"ContainerStarted","Data":"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e"} Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.892745 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-28gz8" podUID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" containerName="registry-server" containerID="cri-o://06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e" gracePeriod=2 Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.894111 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4jglm" event={"ID":"71959757-609a-415a-9717-711c3f8ad66d","Type":"ContainerStarted","Data":"2d55094099bf34c81c5d363cd6fbde46a94dabb43f69d71c21bbad90fa330f45"} Oct 02 11:32:59 crc kubenswrapper[4658]: I1002 11:32:59.909172 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-28gz8" podStartSLOduration=1.58441937 podStartE2EDuration="6.909153123s" podCreationTimestamp="2025-10-02 11:32:53 +0000 UTC" firstStartedPulling="2025-10-02 11:32:54.173030143 +0000 UTC m=+855.064183710" lastFinishedPulling="2025-10-02 11:32:59.497763896 +0000 UTC m=+860.388917463" observedRunningTime="2025-10-02 11:32:59.907184019 +0000 UTC m=+860.798337596" watchObservedRunningTime="2025-10-02 11:32:59.909153123 +0000 UTC m=+860.800306690" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.249001 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.342034 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gvb6\" (UniqueName: \"kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6\") pod \"5f58d62d-8e0e-4c04-a027-8acc49d8edf4\" (UID: \"5f58d62d-8e0e-4c04-a027-8acc49d8edf4\") " Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.349539 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6" (OuterVolumeSpecName: "kube-api-access-5gvb6") pod "5f58d62d-8e0e-4c04-a027-8acc49d8edf4" (UID: "5f58d62d-8e0e-4c04-a027-8acc49d8edf4"). InnerVolumeSpecName "kube-api-access-5gvb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.443257 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gvb6\" (UniqueName: \"kubernetes.io/projected/5f58d62d-8e0e-4c04-a027-8acc49d8edf4-kube-api-access-5gvb6\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.900737 4658 generic.go:334] "Generic (PLEG): container finished" podID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" containerID="06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e" exitCode=0 Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.900787 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-28gz8" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.900815 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-28gz8" event={"ID":"5f58d62d-8e0e-4c04-a027-8acc49d8edf4","Type":"ContainerDied","Data":"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e"} Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.900869 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-28gz8" event={"ID":"5f58d62d-8e0e-4c04-a027-8acc49d8edf4","Type":"ContainerDied","Data":"ebb15d35f9078818690bc30f70b56da6118c006e14a56a87f22dced0291cc9b9"} Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.900891 4658 scope.go:117] "RemoveContainer" containerID="06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.901876 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4jglm" event={"ID":"71959757-609a-415a-9717-711c3f8ad66d","Type":"ContainerStarted","Data":"86ac391c2d9835ef2a45c2b01dda73c12d995eb7ae8b8aa35553c6e191f80573"} Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.916882 4658 scope.go:117] "RemoveContainer" containerID="06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e" Oct 02 11:33:00 crc kubenswrapper[4658]: E1002 11:33:00.917594 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e\": container with ID starting with 06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e not found: ID does not exist" containerID="06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.917665 4658 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e"} err="failed to get container status \"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e\": rpc error: code = NotFound desc = could not find container \"06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e\": container with ID starting with 06eb35d35fbb90b6f7cfa1fb12af070c1a1fe076c3ef702903933bd1bcf4485e not found: ID does not exist" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.931386 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4jglm" podStartSLOduration=3.888089935 podStartE2EDuration="3.931366402s" podCreationTimestamp="2025-10-02 11:32:57 +0000 UTC" firstStartedPulling="2025-10-02 11:32:59.828429965 +0000 UTC m=+860.719583532" lastFinishedPulling="2025-10-02 11:32:59.871706432 +0000 UTC m=+860.762859999" observedRunningTime="2025-10-02 11:33:00.927908811 +0000 UTC m=+861.819062378" watchObservedRunningTime="2025-10-02 11:33:00.931366402 +0000 UTC m=+861.822519969" Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.954406 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:33:00 crc kubenswrapper[4658]: I1002 11:33:00.958568 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-28gz8"] Oct 02 11:33:01 crc kubenswrapper[4658]: I1002 11:33:01.956465 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" path="/var/lib/kubelet/pods/5f58d62d-8e0e-4c04-a027-8acc49d8edf4/volumes" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.429467 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:06 crc kubenswrapper[4658]: E1002 11:33:06.430369 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" containerName="registry-server" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.430402 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" containerName="registry-server" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.430753 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f58d62d-8e0e-4c04-a027-8acc49d8edf4" containerName="registry-server" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.432902 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.437658 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.518783 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5dx\" (UniqueName: \"kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.518857 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.518901 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.619772 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.619852 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.619936 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5dx\" (UniqueName: \"kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.620361 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.620429 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.641480 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nj5dx\" (UniqueName: \"kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx\") pod \"community-operators-vt5sn\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:06 crc kubenswrapper[4658]: I1002 11:33:06.756789 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:07 crc kubenswrapper[4658]: I1002 11:33:07.248285 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:07 crc kubenswrapper[4658]: W1002 11:33:07.257644 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5eff629_5a30_4883_80bb_496aa931e670.slice/crio-cc25ec64c97252d06ecf5418b1f031c0119a2ec81bc0f74b4728a0755b5b874e WatchSource:0}: Error finding container cc25ec64c97252d06ecf5418b1f031c0119a2ec81bc0f74b4728a0755b5b874e: Status 404 returned error can't find the container with id cc25ec64c97252d06ecf5418b1f031c0119a2ec81bc0f74b4728a0755b5b874e Oct 02 11:33:07 crc kubenswrapper[4658]: I1002 11:33:07.955958 4658 generic.go:334] "Generic (PLEG): container finished" podID="b5eff629-5a30-4883-80bb-496aa931e670" containerID="52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9" exitCode=0 Oct 02 11:33:07 crc kubenswrapper[4658]: I1002 11:33:07.955995 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerDied","Data":"52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9"} Oct 02 11:33:07 crc kubenswrapper[4658]: I1002 11:33:07.956236 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerStarted","Data":"cc25ec64c97252d06ecf5418b1f031c0119a2ec81bc0f74b4728a0755b5b874e"} Oct 02 11:33:08 crc kubenswrapper[4658]: I1002 11:33:08.340943 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:33:08 crc kubenswrapper[4658]: I1002 11:33:08.341002 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:33:08 crc kubenswrapper[4658]: I1002 11:33:08.374243 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:33:08 crc kubenswrapper[4658]: I1002 11:33:08.988929 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4jglm" Oct 02 11:33:09 crc kubenswrapper[4658]: I1002 11:33:09.970998 4658 generic.go:334] "Generic (PLEG): container finished" podID="b5eff629-5a30-4883-80bb-496aa931e670" containerID="fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173" exitCode=0 Oct 02 11:33:09 crc kubenswrapper[4658]: I1002 11:33:09.971096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerDied","Data":"fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173"} Oct 02 11:33:10 crc kubenswrapper[4658]: I1002 11:33:10.978007 4658 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerStarted","Data":"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9"} Oct 02 11:33:10 crc kubenswrapper[4658]: I1002 11:33:10.992830 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vt5sn" podStartSLOduration=2.407378736 podStartE2EDuration="4.992808626s" podCreationTimestamp="2025-10-02 11:33:06 +0000 UTC" firstStartedPulling="2025-10-02 11:33:07.959026363 +0000 UTC m=+868.850179930" lastFinishedPulling="2025-10-02 11:33:10.544456253 +0000 UTC m=+871.435609820" observedRunningTime="2025-10-02 11:33:10.991261586 +0000 UTC m=+871.882415153" watchObservedRunningTime="2025-10-02 11:33:10.992808626 +0000 UTC m=+871.883962193" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.258904 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m"] Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.260659 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.262770 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8l7rm" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.271708 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m"] Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.341088 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.341134 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.341172 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvdq\" (UniqueName: \"kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.442475 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvdq\" (UniqueName: \"kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " 
pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.442978 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.443011 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.443507 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.443533 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.463349 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvdq\" (UniqueName: \"kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq\") pod \"051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.582598 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:14 crc kubenswrapper[4658]: I1002 11:33:14.986861 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m"] Oct 02 11:33:15 crc kubenswrapper[4658]: I1002 11:33:15.003015 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" event={"ID":"8a703ab4-d1c1-417b-8f0b-7530ed09a26a","Type":"ContainerStarted","Data":"e50f044b6c99056c10aed5248324a761d59ab7d46e3cfb7c2dcaadb4cb83139f"} Oct 02 11:33:16 crc kubenswrapper[4658]: I1002 11:33:16.013096 4658 generic.go:334] "Generic (PLEG): container finished" podID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerID="5e7895894b3f092a5f317b1aabc90771568a5f8db312337b14fcc6060103df90" exitCode=0 Oct 02 11:33:16 crc kubenswrapper[4658]: I1002 11:33:16.013223 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" event={"ID":"8a703ab4-d1c1-417b-8f0b-7530ed09a26a","Type":"ContainerDied","Data":"5e7895894b3f092a5f317b1aabc90771568a5f8db312337b14fcc6060103df90"} Oct 02 11:33:16 crc kubenswrapper[4658]: I1002 11:33:16.757140 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:16 crc kubenswrapper[4658]: I1002 11:33:16.757198 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:16 crc kubenswrapper[4658]: I1002 11:33:16.806155 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:17 crc kubenswrapper[4658]: I1002 11:33:17.083851 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:18 crc kubenswrapper[4658]: I1002 11:33:18.035378 4658 generic.go:334] "Generic (PLEG): container finished" podID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerID="e5b8dc16e65d3588f1d5755f6e0b8b080e4e917ea39dbfadfc8a92c5a8f8dce5" exitCode=0 Oct 02 11:33:18 crc kubenswrapper[4658]: I1002 11:33:18.035478 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" event={"ID":"8a703ab4-d1c1-417b-8f0b-7530ed09a26a","Type":"ContainerDied","Data":"e5b8dc16e65d3588f1d5755f6e0b8b080e4e917ea39dbfadfc8a92c5a8f8dce5"} Oct 02 11:33:19 crc kubenswrapper[4658]: I1002 11:33:19.043491 4658 generic.go:334] "Generic (PLEG): container finished" podID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerID="c4c7a75875f68bf0f692fdf04d64cb15134717226994145df65cef9a29a99445" exitCode=0 Oct 02 11:33:19 crc kubenswrapper[4658]: I1002 11:33:19.043608 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" event={"ID":"8a703ab4-d1c1-417b-8f0b-7530ed09a26a","Type":"ContainerDied","Data":"c4c7a75875f68bf0f692fdf04d64cb15134717226994145df65cef9a29a99445"} Oct 02 11:33:19 crc kubenswrapper[4658]: I1002 11:33:19.604481 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:19 crc kubenswrapper[4658]: I1002 11:33:19.604820 4658 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-vt5sn" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="registry-server" containerID="cri-o://6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9" gracePeriod=2 Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.011497 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.051867 4658 generic.go:334] "Generic (PLEG): container finished" podID="b5eff629-5a30-4883-80bb-496aa931e670" containerID="6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9" exitCode=0 Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.052123 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vt5sn" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.052683 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerDied","Data":"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9"} Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.052713 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vt5sn" event={"ID":"b5eff629-5a30-4883-80bb-496aa931e670","Type":"ContainerDied","Data":"cc25ec64c97252d06ecf5418b1f031c0119a2ec81bc0f74b4728a0755b5b874e"} Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.052732 4658 scope.go:117] "RemoveContainer" containerID="6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.080395 4658 scope.go:117] "RemoveContainer" containerID="fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.103859 4658 scope.go:117] "RemoveContainer" containerID="52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.121737 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj5dx\" (UniqueName: \"kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx\") pod \"b5eff629-5a30-4883-80bb-496aa931e670\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.121808 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content\") pod \"b5eff629-5a30-4883-80bb-496aa931e670\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.121909 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities\") pod \"b5eff629-5a30-4883-80bb-496aa931e670\" (UID: \"b5eff629-5a30-4883-80bb-496aa931e670\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.123100 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities" (OuterVolumeSpecName: "utilities") pod "b5eff629-5a30-4883-80bb-496aa931e670" (UID: "b5eff629-5a30-4883-80bb-496aa931e670"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.134555 4658 scope.go:117] "RemoveContainer" containerID="6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9" Oct 02 11:33:20 crc kubenswrapper[4658]: E1002 11:33:20.144427 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9\": container with ID starting with 6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9 not found: ID does not exist" containerID="6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.144468 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx" (OuterVolumeSpecName: "kube-api-access-nj5dx") pod "b5eff629-5a30-4883-80bb-496aa931e670" (UID: "b5eff629-5a30-4883-80bb-496aa931e670"). InnerVolumeSpecName "kube-api-access-nj5dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.144482 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9"} err="failed to get container status \"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9\": rpc error: code = NotFound desc = could not find container \"6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9\": container with ID starting with 6169b4cd1cc7d555eff19d1dca26696538b1443562355debfcda67dca0ca9db9 not found: ID does not exist" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.144520 4658 scope.go:117] "RemoveContainer" containerID="fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173" Oct 02 11:33:20 crc kubenswrapper[4658]: E1002 11:33:20.145070 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173\": container with ID starting with fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173 not found: ID does not exist" containerID="fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.145125 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173"} err="failed to get container status \"fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173\": rpc error: code = NotFound desc = could not find container \"fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173\": container with ID starting with fd83b1fc9914222f08c887e3e110be554d1f568f3d987dfccfce2105ce6e2173 not found: ID does not exist" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.145159 4658 scope.go:117] "RemoveContainer" containerID="52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9" Oct 02 11:33:20 crc kubenswrapper[4658]: E1002 11:33:20.146220 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9\": container with ID starting with 52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9 not found: ID does not 
exist" containerID="52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.146267 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9"} err="failed to get container status \"52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9\": rpc error: code = NotFound desc = could not find container \"52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9\": container with ID starting with 52a5afcafc7f537657e8d616f41389db7628ebc9f917b9d81f69904620e8fad9 not found: ID does not exist" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.208488 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5eff629-5a30-4883-80bb-496aa931e670" (UID: "b5eff629-5a30-4883-80bb-496aa931e670"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.224032 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.224068 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj5dx\" (UniqueName: \"kubernetes.io/projected/b5eff629-5a30-4883-80bb-496aa931e670-kube-api-access-nj5dx\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.224077 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eff629-5a30-4883-80bb-496aa931e670-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.357572 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.399908 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.405525 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vt5sn"] Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.431403 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle\") pod \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.431491 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcvdq\" (UniqueName: \"kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq\") pod \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.431522 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util\") pod \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\" (UID: \"8a703ab4-d1c1-417b-8f0b-7530ed09a26a\") " Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.432384 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle" (OuterVolumeSpecName: "bundle") pod "8a703ab4-d1c1-417b-8f0b-7530ed09a26a" (UID: "8a703ab4-d1c1-417b-8f0b-7530ed09a26a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.435266 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq" (OuterVolumeSpecName: "kube-api-access-vcvdq") pod "8a703ab4-d1c1-417b-8f0b-7530ed09a26a" (UID: "8a703ab4-d1c1-417b-8f0b-7530ed09a26a"). InnerVolumeSpecName "kube-api-access-vcvdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.533036 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcvdq\" (UniqueName: \"kubernetes.io/projected/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-kube-api-access-vcvdq\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.533097 4658 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.892287 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util" (OuterVolumeSpecName: "util") pod "8a703ab4-d1c1-417b-8f0b-7530ed09a26a" (UID: "8a703ab4-d1c1-417b-8f0b-7530ed09a26a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:20 crc kubenswrapper[4658]: I1002 11:33:20.938492 4658 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a703ab4-d1c1-417b-8f0b-7530ed09a26a-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:21 crc kubenswrapper[4658]: I1002 11:33:21.062286 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" Oct 02 11:33:21 crc kubenswrapper[4658]: I1002 11:33:21.062315 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m" event={"ID":"8a703ab4-d1c1-417b-8f0b-7530ed09a26a","Type":"ContainerDied","Data":"e50f044b6c99056c10aed5248324a761d59ab7d46e3cfb7c2dcaadb4cb83139f"} Oct 02 11:33:21 crc kubenswrapper[4658]: I1002 11:33:21.062357 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50f044b6c99056c10aed5248324a761d59ab7d46e3cfb7c2dcaadb4cb83139f" Oct 02 11:33:21 crc kubenswrapper[4658]: I1002 11:33:21.956759 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5eff629-5a30-4883-80bb-496aa931e670" path="/var/lib/kubelet/pods/b5eff629-5a30-4883-80bb-496aa931e670/volumes" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114230 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf"] Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 11:33:23.114812 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="pull" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114825 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="pull" Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 11:33:23.114850 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="registry-server" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114857 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="registry-server" Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 11:33:23.114869 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="extract-utilities" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114877 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="extract-utilities" Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 11:33:23.114892 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="util" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114898 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="util" Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 11:33:23.114908 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="extract-content" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114916 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="extract-content" Oct 02 11:33:23 crc kubenswrapper[4658]: E1002 
11:33:23.114924 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="extract" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.114930 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="extract" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.115041 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5eff629-5a30-4883-80bb-496aa931e670" containerName="registry-server" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.115056 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a703ab4-d1c1-417b-8f0b-7530ed09a26a" containerName="extract" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.115803 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.119244 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-wf4c5" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.141721 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf"] Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.168964 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsddw\" (UniqueName: \"kubernetes.io/projected/943e808f-860b-4f8a-a933-84f0dd0cddc5-kube-api-access-nsddw\") pod \"openstack-operator-controller-operator-6f47f5dc76-j82tf\" (UID: \"943e808f-860b-4f8a-a933-84f0dd0cddc5\") " pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.270370 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsddw\" (UniqueName: \"kubernetes.io/projected/943e808f-860b-4f8a-a933-84f0dd0cddc5-kube-api-access-nsddw\") pod \"openstack-operator-controller-operator-6f47f5dc76-j82tf\" (UID: \"943e808f-860b-4f8a-a933-84f0dd0cddc5\") " pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.292266 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsddw\" (UniqueName: \"kubernetes.io/projected/943e808f-860b-4f8a-a933-84f0dd0cddc5-kube-api-access-nsddw\") pod \"openstack-operator-controller-operator-6f47f5dc76-j82tf\" (UID: \"943e808f-860b-4f8a-a933-84f0dd0cddc5\") " pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.433593 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:23 crc kubenswrapper[4658]: I1002 11:33:23.917411 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf"] Oct 02 11:33:23 crc kubenswrapper[4658]: W1002 11:33:23.924841 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943e808f_860b_4f8a_a933_84f0dd0cddc5.slice/crio-4a40510d30c943a86efa57c176d49e5355cbd579c9f05072ca14bac8999216df WatchSource:0}: Error finding container 4a40510d30c943a86efa57c176d49e5355cbd579c9f05072ca14bac8999216df: Status 404 returned error can't find the container with id 4a40510d30c943a86efa57c176d49e5355cbd579c9f05072ca14bac8999216df Oct 02 11:33:24 crc kubenswrapper[4658]: I1002 11:33:24.083590 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" event={"ID":"943e808f-860b-4f8a-a933-84f0dd0cddc5","Type":"ContainerStarted","Data":"4a40510d30c943a86efa57c176d49e5355cbd579c9f05072ca14bac8999216df"} Oct 02 11:33:28 crc kubenswrapper[4658]: I1002 11:33:28.111013 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" event={"ID":"943e808f-860b-4f8a-a933-84f0dd0cddc5","Type":"ContainerStarted","Data":"a2e707c9cb725affafcd1f55df242d2d65b1a5ea0f52e4f9742988b59efd6737"} Oct 02 11:33:31 crc kubenswrapper[4658]: I1002 11:33:31.130413 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" event={"ID":"943e808f-860b-4f8a-a933-84f0dd0cddc5","Type":"ContainerStarted","Data":"71a3ea1943dddf243bc5e7f6bad3195df2ea38caf7ce5bfef49ae0ae0de94dcd"} Oct 02 11:33:31 crc kubenswrapper[4658]: I1002 11:33:31.131007 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:31 crc kubenswrapper[4658]: I1002 11:33:31.165258 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" podStartSLOduration=1.625836021 podStartE2EDuration="8.165242302s" podCreationTimestamp="2025-10-02 11:33:23 +0000 UTC" firstStartedPulling="2025-10-02 11:33:23.92670233 +0000 UTC m=+884.817855897" lastFinishedPulling="2025-10-02 11:33:30.466108611 +0000 UTC m=+891.357262178" observedRunningTime="2025-10-02 11:33:31.164740636 +0000 UTC m=+892.055894203" watchObservedRunningTime="2025-10-02 11:33:31.165242302 +0000 UTC m=+892.056395869" Oct 02 11:33:32 crc kubenswrapper[4658]: I1002 11:33:32.140837 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6f47f5dc76-j82tf" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.331934 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.335205 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.344849 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.412742 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2mmw\" (UniqueName: \"kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.412799 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.412872 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.513780 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.513885 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.513929 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2mmw\" (UniqueName: \"kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.514366 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.514383 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.548942 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b2mmw\" (UniqueName: \"kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw\") pod \"redhat-marketplace-tjtkq\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:44 crc kubenswrapper[4658]: I1002 11:33:44.662142 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:45 crc kubenswrapper[4658]: I1002 11:33:45.129912 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:33:45 crc kubenswrapper[4658]: I1002 11:33:45.214830 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerStarted","Data":"38e219544507dd5e4bf7f0713f7ec7dd1bb8a6bab5dcb509a639a4964b2448ee"} Oct 02 11:33:46 crc kubenswrapper[4658]: I1002 11:33:46.223877 4658 generic.go:334] "Generic (PLEG): container finished" podID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerID="035e2c2e2289ca971ac92fddf5940f5f53a2507647e1b6bd41af661edfb56f6e" exitCode=0 Oct 02 11:33:46 crc kubenswrapper[4658]: I1002 11:33:46.223932 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerDied","Data":"035e2c2e2289ca971ac92fddf5940f5f53a2507647e1b6bd41af661edfb56f6e"} Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.238314 4658 generic.go:334] "Generic (PLEG): container finished" podID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerID="cb1ad6f6553ca53de5e016f1a946cff2f56218fbf9bcd61c276805de3da90b72" exitCode=0 Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.238424 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerDied","Data":"cb1ad6f6553ca53de5e016f1a946cff2f56218fbf9bcd61c276805de3da90b72"} Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.763604 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.764815 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.766912 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jcbx5" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.777625 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.785246 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.786401 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.788086 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dl4cd" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.801316 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.802571 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.804437 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vddj6" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.806044 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.820610 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.821883 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.824638 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9dsgr" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.825404 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.866369 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.882364 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.883795 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.886162 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.887272 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.887893 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lpwww" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.888854 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47kj\" (UniqueName: \"kubernetes.io/projected/3f426838-95ca-4579-9745-e78f0ccab683-kube-api-access-f47kj\") pod \"barbican-operator-controller-manager-6ff8b75857-kkldn\" (UID: \"3f426838-95ca-4579-9745-e78f0ccab683\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.888943 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vwvw\" (UniqueName: \"kubernetes.io/projected/7744dcc1-5c52-4447-8123-53e4c98250fd-kube-api-access-8vwvw\") pod \"cinder-operator-controller-manager-644bddb6d8-gckv9\" (UID: \"7744dcc1-5c52-4447-8123-53e4c98250fd\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.889098 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-272xt\" (UniqueName: \"kubernetes.io/projected/af944184-d59a-467d-983e-c66fb79823c6-kube-api-access-272xt\") pod \"designate-operator-controller-manager-84f4f7b77b-fgm4w\" (UID: \"af944184-d59a-467d-983e-c66fb79823c6\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.892671 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rwb27" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.894969 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.946377 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.967816 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq"] Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.969166 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.977395 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rcrzp" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.977719 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990000 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2kq5\" (UniqueName: \"kubernetes.io/projected/70026a4a-6db4-4777-afed-a5ea3de1fc60-kube-api-access-g2kq5\") pod \"glance-operator-controller-manager-84958c4d49-8ttj2\" (UID: \"70026a4a-6db4-4777-afed-a5ea3de1fc60\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990062 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47kj\" (UniqueName: \"kubernetes.io/projected/3f426838-95ca-4579-9745-e78f0ccab683-kube-api-access-f47kj\") pod \"barbican-operator-controller-manager-6ff8b75857-kkldn\" (UID: \"3f426838-95ca-4579-9745-e78f0ccab683\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990113 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97cf7\" (UniqueName: \"kubernetes.io/projected/d480d1a6-c309-454f-8e99-a762feed8490-kube-api-access-97cf7\") pod \"horizon-operator-controller-manager-9f4696d94-ljz2h\" (UID: \"d480d1a6-c309-454f-8e99-a762feed8490\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990148 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vwvw\" (UniqueName: \"kubernetes.io/projected/7744dcc1-5c52-4447-8123-53e4c98250fd-kube-api-access-8vwvw\") pod \"cinder-operator-controller-manager-644bddb6d8-gckv9\" (UID: \"7744dcc1-5c52-4447-8123-53e4c98250fd\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990225 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-272xt\" (UniqueName: \"kubernetes.io/projected/af944184-d59a-467d-983e-c66fb79823c6-kube-api-access-272xt\") pod \"designate-operator-controller-manager-84f4f7b77b-fgm4w\" (UID: \"af944184-d59a-467d-983e-c66fb79823c6\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:33:48 crc kubenswrapper[4658]: I1002 11:33:48.990271 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blcw4\" (UniqueName: \"kubernetes.io/projected/6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c-kube-api-access-blcw4\") pod \"heat-operator-controller-manager-5d889d78cf-66q5b\" (UID: \"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.002357 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 
11:33:49.003944 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.008902 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-t654k" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.023766 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.024683 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vwvw\" (UniqueName: \"kubernetes.io/projected/7744dcc1-5c52-4447-8123-53e4c98250fd-kube-api-access-8vwvw\") pod \"cinder-operator-controller-manager-644bddb6d8-gckv9\" (UID: \"7744dcc1-5c52-4447-8123-53e4c98250fd\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.035524 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.036746 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.070347 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-272xt\" (UniqueName: \"kubernetes.io/projected/af944184-d59a-467d-983e-c66fb79823c6-kube-api-access-272xt\") pod \"designate-operator-controller-manager-84f4f7b77b-fgm4w\" (UID: \"af944184-d59a-467d-983e-c66fb79823c6\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.071024 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lrfnw" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.071060 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47kj\" (UniqueName: \"kubernetes.io/projected/3f426838-95ca-4579-9745-e78f0ccab683-kube-api-access-f47kj\") pod \"barbican-operator-controller-manager-6ff8b75857-kkldn\" (UID: \"3f426838-95ca-4579-9745-e78f0ccab683\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.098895 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2kq5\" (UniqueName: \"kubernetes.io/projected/70026a4a-6db4-4777-afed-a5ea3de1fc60-kube-api-access-g2kq5\") pod \"glance-operator-controller-manager-84958c4d49-8ttj2\" (UID: \"70026a4a-6db4-4777-afed-a5ea3de1fc60\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.099066 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97cf7\" (UniqueName: \"kubernetes.io/projected/d480d1a6-c309-454f-8e99-a762feed8490-kube-api-access-97cf7\") pod \"horizon-operator-controller-manager-9f4696d94-ljz2h\" (UID: \"d480d1a6-c309-454f-8e99-a762feed8490\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.099150 4658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7f6\" (UniqueName: \"kubernetes.io/projected/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-kube-api-access-6x7f6\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.099192 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.099397 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t57ps\" (UniqueName: \"kubernetes.io/projected/bf9ac0a3-4903-4115-9793-b6bd913d4e0a-kube-api-access-t57ps\") pod \"ironic-operator-controller-manager-5cd4858477-7mfsk\" (UID: \"bf9ac0a3-4903-4115-9793-b6bd913d4e0a\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.099497 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blcw4\" (UniqueName: \"kubernetes.io/projected/6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c-kube-api-access-blcw4\") pod \"heat-operator-controller-manager-5d889d78cf-66q5b\" (UID: \"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.101949 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.117096 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.124749 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.125396 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w"
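Each repeated util.go:30 message ("No sandbox for pod can be found. Need to start a new one") is emitted while syncing a pod that was just ADDed: kubelet finds no existing sandbox for the pod and schedules sandbox creation before any containers can start. A rough sketch of that decision, against an assumed, simplified runtime interface (the real call path goes through the CRI runtime service, not this toy API):

```go
// Illustrative sketch only: the sandbox-existence check behind the
// util.go:30 lines. The runtime interface below is an assumption,
// not the actual CRI client API.
package main

import "fmt"

type runtime interface {
	SandboxFor(podUID string) (string, bool)
	RunPodSandbox(podUID string) string
}

type fakeRuntime map[string]string // podUID -> sandbox ID

func (f fakeRuntime) SandboxFor(uid string) (string, bool) {
	id, ok := f[uid]
	return id, ok
}

func (f fakeRuntime) RunPodSandbox(uid string) string {
	f[uid] = "sandbox-for-" + uid
	return f[uid]
}

// ensureSandbox mirrors the decision in the log: reuse the sandbox if
// one exists, otherwise log and create a new one.
func ensureSandbox(r runtime, pod, uid string) string {
	if id, ok := r.SandboxFor(uid); ok {
		return id
	}
	fmt.Printf("No sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
	return r.RunPodSandbox(uid)
}

func main() {
	r := fakeRuntime{}
	ensureSandbox(r, "openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w", "af944184")
}
```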
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.166811 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.175117 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2kq5\" (UniqueName: \"kubernetes.io/projected/70026a4a-6db4-4777-afed-a5ea3de1fc60-kube-api-access-g2kq5\") pod \"glance-operator-controller-manager-84958c4d49-8ttj2\" (UID: \"70026a4a-6db4-4777-afed-a5ea3de1fc60\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.180714 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97cf7\" (UniqueName: \"kubernetes.io/projected/d480d1a6-c309-454f-8e99-a762feed8490-kube-api-access-97cf7\") pod \"horizon-operator-controller-manager-9f4696d94-ljz2h\" (UID: \"d480d1a6-c309-454f-8e99-a762feed8490\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.181839 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blcw4\" (UniqueName: \"kubernetes.io/projected/6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c-kube-api-access-blcw4\") pod \"heat-operator-controller-manager-5d889d78cf-66q5b\" (UID: \"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.183131 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.188481 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2hjn8" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.198422 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.200820 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7f6\" (UniqueName: \"kubernetes.io/projected/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-kube-api-access-6x7f6\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.200864 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.200901 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjxh\" (UniqueName: \"kubernetes.io/projected/d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b-kube-api-access-gbjxh\") pod \"keystone-operator-controller-manager-5bd55b4bff-l62bl\" (UID: \"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b\") " 
pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.200944 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t57ps\" (UniqueName: \"kubernetes.io/projected/bf9ac0a3-4903-4115-9793-b6bd913d4e0a-kube-api-access-t57ps\") pod \"ironic-operator-controller-manager-5cd4858477-7mfsk\" (UID: \"bf9ac0a3-4903-4115-9793-b6bd913d4e0a\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.201943 4658 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.202143 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert podName:f527a8e5-d051-4017-80e4-e3b2f1fd59ba nodeName:}" failed. No retries permitted until 2025-10-02 11:33:49.702125028 +0000 UTC m=+910.593278595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert") pod "infra-operator-controller-manager-9d6c5db85-kznvq" (UID: "f527a8e5-d051-4017-80e4-e3b2f1fd59ba") : secret "infra-operator-webhook-server-cert" not found Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.206740 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.217707 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.219045 4658 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.221612 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-r25bp" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.222454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7f6\" (UniqueName: \"kubernetes.io/projected/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-kube-api-access-6x7f6\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.227440 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.227762 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t57ps\" (UniqueName: \"kubernetes.io/projected/bf9ac0a3-4903-4115-9793-b6bd913d4e0a-kube-api-access-t57ps\") pod \"ironic-operator-controller-manager-5cd4858477-7mfsk\" (UID: \"bf9ac0a3-4903-4115-9793-b6bd913d4e0a\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.230813 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.232090 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.239761 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4f585" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.240593 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.243200 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.244460 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.246678 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mpv6w" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.254763 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.261336 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.272419 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.274007 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.276038 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-57hss" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.285133 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerStarted","Data":"841c49b85a53a00908e955f60a8cbde6366352ef8e87dea4124bee9242152270"} Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.290501 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.301348 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.308738 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjxh\" (UniqueName: \"kubernetes.io/projected/d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b-kube-api-access-gbjxh\") pod \"keystone-operator-controller-manager-5bd55b4bff-l62bl\" (UID: \"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.308884 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl79j\" (UniqueName: \"kubernetes.io/projected/9787421c-8d35-4d30-8946-90bc71eba9c0-kube-api-access-hl79j\") pod \"mariadb-operator-controller-manager-88c7-g8dwz\" (UID: \"9787421c-8d35-4d30-8946-90bc71eba9c0\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.308961 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9jd\" (UniqueName: \"kubernetes.io/projected/55b04e2c-c701-4f74-9fb6-1dce9d2de108-kube-api-access-hg9jd\") pod \"manila-operator-controller-manager-6d68dbc695-tnfxq\" (UID: \"55b04e2c-c701-4f74-9fb6-1dce9d2de108\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.311923 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.313058 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv"
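The "Generic (PLEG)" / "SyncLoop (PLEG)" pairs are the pod lifecycle event generator turning observed container-state changes into events: the two exitCode=0 ContainerDied events earlier are the marketplace catalog pod's extract containers completing normally, and the ContainerStarted just above is its serving container coming up. A toy relisting sketch, with a deliberately simplified state model (assumed, not kubelet's):

```go
// Toy sketch of PLEG-style relisting: diff two container-state
// snapshots and emit ContainerStarted/ContainerDied events.
package main

import "fmt"

type state int

const (
	absent state = iota // zero value: container not seen before
	running
	exited
)

func relist(prev, curr map[string]state) {
	for id, now := range curr {
		switch was := prev[id]; {
		case was != running && now == running:
			fmt.Printf("event={Type:ContainerStarted Data:%q}\n", id)
		case was == running && now == exited:
			fmt.Printf("event={Type:ContainerDied Data:%q}\n", id)
		}
	}
}

func main() {
	prev := map[string]state{"cb1ad6f6": running}
	curr := map[string]state{"cb1ad6f6": exited, "841c49b8": running}
	relist(prev, curr)
}
```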
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.328587 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fk7cr" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.335201 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.336630 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.356091 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.357903 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dtqmf" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.358183 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.359628 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjxh\" (UniqueName: \"kubernetes.io/projected/d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b-kube-api-access-gbjxh\") pod \"keystone-operator-controller-manager-5bd55b4bff-l62bl\" (UID: \"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.359723 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.387284 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.397626 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.397748 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.401145 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mlmf6" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.406680 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.407834 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.408351 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.410156 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vw2zg" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414201 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl79j\" (UniqueName: \"kubernetes.io/projected/9787421c-8d35-4d30-8946-90bc71eba9c0-kube-api-access-hl79j\") pod \"mariadb-operator-controller-manager-88c7-g8dwz\" (UID: \"9787421c-8d35-4d30-8946-90bc71eba9c0\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414255 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9jd\" (UniqueName: \"kubernetes.io/projected/55b04e2c-c701-4f74-9fb6-1dce9d2de108-kube-api-access-hg9jd\") pod \"manila-operator-controller-manager-6d68dbc695-tnfxq\" (UID: \"55b04e2c-c701-4f74-9fb6-1dce9d2de108\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414287 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndjhk\" (UniqueName: \"kubernetes.io/projected/5aeb03f1-db88-497b-b3cb-11e01e2a7b31-kube-api-access-ndjhk\") pod \"octavia-operator-controller-manager-7b787867f4-kbj6t\" (UID: \"5aeb03f1-db88-497b-b3cb-11e01e2a7b31\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414346 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p7vh\" (UniqueName: \"kubernetes.io/projected/7b2e2130-4b00-4242-8254-c8be160bfe89-kube-api-access-4p7vh\") pod \"ovn-operator-controller-manager-9976ff44c-mhrcv\" (UID: \"7b2e2130-4b00-4242-8254-c8be160bfe89\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414376 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txk8f\" (UniqueName: \"kubernetes.io/projected/6a460926-8982-40c1-b177-3620aa3dcb79-kube-api-access-txk8f\") pod \"neutron-operator-controller-manager-849d5b9b84-fsnf7\" (UID: \"6a460926-8982-40c1-b177-3620aa3dcb79\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.414429 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xb6z\" (UniqueName: \"kubernetes.io/projected/830f6e33-ad1f-4033-a725-9f10415996e7-kube-api-access-6xb6z\") pod \"nova-operator-controller-manager-64cd67b5cb-htz9g\" (UID: \"830f6e33-ad1f-4033-a725-9f10415996e7\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.418468 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.419931 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.443370 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.450697 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jcmxz" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.456984 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.457623 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl79j\" (UniqueName: \"kubernetes.io/projected/9787421c-8d35-4d30-8946-90bc71eba9c0-kube-api-access-hl79j\") pod \"mariadb-operator-controller-manager-88c7-g8dwz\" (UID: \"9787421c-8d35-4d30-8946-90bc71eba9c0\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.463065 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.471997 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9jd\" (UniqueName: \"kubernetes.io/projected/55b04e2c-c701-4f74-9fb6-1dce9d2de108-kube-api-access-hg9jd\") pod \"manila-operator-controller-manager-6d68dbc695-tnfxq\" (UID: \"55b04e2c-c701-4f74-9fb6-1dce9d2de108\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.480481 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.483659 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-4bhqs"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.491475 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.503066 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-4bhqs"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.515500 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-skwmh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517507 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn78j\" (UniqueName: \"kubernetes.io/projected/afbaa143-b11e-406d-b797-6ba114fbf9a4-kube-api-access-bn78j\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517577 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5xm\" (UniqueName: \"kubernetes.io/projected/c92dcd56-734e-430c-813e-1405ab2e141b-kube-api-access-zh5xm\") pod \"swift-operator-controller-manager-84d6b4b759-ppg68\" (UID: \"c92dcd56-734e-430c-813e-1405ab2e141b\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517612 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thm6r\" (UniqueName: \"kubernetes.io/projected/33b8c756-1330-4114-bf78-2b3835667a1e-kube-api-access-thm6r\") pod \"telemetry-operator-controller-manager-b8d54b5d7-49k5r\" (UID: \"33b8c756-1330-4114-bf78-2b3835667a1e\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517656 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndjhk\" (UniqueName: \"kubernetes.io/projected/5aeb03f1-db88-497b-b3cb-11e01e2a7b31-kube-api-access-ndjhk\") pod \"octavia-operator-controller-manager-7b787867f4-kbj6t\" (UID: \"5aeb03f1-db88-497b-b3cb-11e01e2a7b31\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517695 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p7vh\" (UniqueName: \"kubernetes.io/projected/7b2e2130-4b00-4242-8254-c8be160bfe89-kube-api-access-4p7vh\") pod \"ovn-operator-controller-manager-9976ff44c-mhrcv\" (UID: \"7b2e2130-4b00-4242-8254-c8be160bfe89\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517723 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517756 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkls8\" (UniqueName: 
\"kubernetes.io/projected/c802dbff-c65f-40e9-91ee-3ea6f0aee6a2-kube-api-access-mkls8\") pod \"placement-operator-controller-manager-589c58c6c-wqqdv\" (UID: \"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517785 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txk8f\" (UniqueName: \"kubernetes.io/projected/6a460926-8982-40c1-b177-3620aa3dcb79-kube-api-access-txk8f\") pod \"neutron-operator-controller-manager-849d5b9b84-fsnf7\" (UID: \"6a460926-8982-40c1-b177-3620aa3dcb79\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.517859 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xb6z\" (UniqueName: \"kubernetes.io/projected/830f6e33-ad1f-4033-a725-9f10415996e7-kube-api-access-6xb6z\") pod \"nova-operator-controller-manager-64cd67b5cb-htz9g\" (UID: \"830f6e33-ad1f-4033-a725-9f10415996e7\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.525440 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.534278 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjtkq" podStartSLOduration=3.038429552 podStartE2EDuration="5.534246427s" podCreationTimestamp="2025-10-02 11:33:44 +0000 UTC" firstStartedPulling="2025-10-02 11:33:46.226967885 +0000 UTC m=+907.118121452" lastFinishedPulling="2025-10-02 11:33:48.72278476 +0000 UTC m=+909.613938327" observedRunningTime="2025-10-02 11:33:49.329091685 +0000 UTC m=+910.220245272" watchObservedRunningTime="2025-10-02 11:33:49.534246427 +0000 UTC m=+910.425399994" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.552309 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.557405 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.591183 4658 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.601084 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p7vh\" (UniqueName: \"kubernetes.io/projected/7b2e2130-4b00-4242-8254-c8be160bfe89-kube-api-access-4p7vh\") pod \"ovn-operator-controller-manager-9976ff44c-mhrcv\" (UID: \"7b2e2130-4b00-4242-8254-c8be160bfe89\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.614807 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8qlwg" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623313 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5xm\" (UniqueName: \"kubernetes.io/projected/c92dcd56-734e-430c-813e-1405ab2e141b-kube-api-access-zh5xm\") pod \"swift-operator-controller-manager-84d6b4b759-ppg68\" (UID: \"c92dcd56-734e-430c-813e-1405ab2e141b\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623375 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thm6r\" (UniqueName: \"kubernetes.io/projected/33b8c756-1330-4114-bf78-2b3835667a1e-kube-api-access-thm6r\") pod \"telemetry-operator-controller-manager-b8d54b5d7-49k5r\" (UID: \"33b8c756-1330-4114-bf78-2b3835667a1e\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623418 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbst\" (UniqueName: \"kubernetes.io/projected/3dba06c0-4986-438c-a553-76b0bcddd74c-kube-api-access-wpbst\") pod \"test-operator-controller-manager-85777745bb-4bhqs\" (UID: \"3dba06c0-4986-438c-a553-76b0bcddd74c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623477 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623521 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkls8\" (UniqueName: \"kubernetes.io/projected/c802dbff-c65f-40e9-91ee-3ea6f0aee6a2-kube-api-access-mkls8\") pod \"placement-operator-controller-manager-589c58c6c-wqqdv\" (UID: \"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.623857 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn78j\" (UniqueName: \"kubernetes.io/projected/afbaa143-b11e-406d-b797-6ba114fbf9a4-kube-api-access-bn78j\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 
crc kubenswrapper[4658]: I1002 11:33:49.630104 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"] Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.630473 4658 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.630553 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert podName:afbaa143-b11e-406d-b797-6ba114fbf9a4 nodeName:}" failed. No retries permitted until 2025-10-02 11:33:50.130531828 +0000 UTC m=+911.021685395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert") pod "openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" (UID: "afbaa143-b11e-406d-b797-6ba114fbf9a4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.637252 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xb6z\" (UniqueName: \"kubernetes.io/projected/830f6e33-ad1f-4033-a725-9f10415996e7-kube-api-access-6xb6z\") pod \"nova-operator-controller-manager-64cd67b5cb-htz9g\" (UID: \"830f6e33-ad1f-4033-a725-9f10415996e7\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.644944 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndjhk\" (UniqueName: \"kubernetes.io/projected/5aeb03f1-db88-497b-b3cb-11e01e2a7b31-kube-api-access-ndjhk\") pod \"octavia-operator-controller-manager-7b787867f4-kbj6t\" (UID: \"5aeb03f1-db88-497b-b3cb-11e01e2a7b31\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.645118 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txk8f\" (UniqueName: \"kubernetes.io/projected/6a460926-8982-40c1-b177-3620aa3dcb79-kube-api-access-txk8f\") pod \"neutron-operator-controller-manager-849d5b9b84-fsnf7\" (UID: \"6a460926-8982-40c1-b177-3620aa3dcb79\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.677724 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thm6r\" (UniqueName: \"kubernetes.io/projected/33b8c756-1330-4114-bf78-2b3835667a1e-kube-api-access-thm6r\") pod \"telemetry-operator-controller-manager-b8d54b5d7-49k5r\" (UID: \"33b8c756-1330-4114-bf78-2b3835667a1e\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.683782 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5xm\" (UniqueName: \"kubernetes.io/projected/c92dcd56-734e-430c-813e-1405ab2e141b-kube-api-access-zh5xm\") pod \"swift-operator-controller-manager-84d6b4b759-ppg68\" (UID: \"c92dcd56-734e-430c-813e-1405ab2e141b\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.688376 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.689841 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkls8\" (UniqueName: \"kubernetes.io/projected/c802dbff-c65f-40e9-91ee-3ea6f0aee6a2-kube-api-access-mkls8\") pod \"placement-operator-controller-manager-589c58c6c-wqqdv\" (UID: \"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.691065 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn78j\" (UniqueName: \"kubernetes.io/projected/afbaa143-b11e-406d-b797-6ba114fbf9a4-kube-api-access-bn78j\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.729585 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"] Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.730871 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.730941 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npd6m\" (UniqueName: \"kubernetes.io/projected/e9eb741d-265d-4f59-ab6e-c6a42f720801-kube-api-access-npd6m\") pod \"watcher-operator-controller-manager-7fc7d86889-mqpv9\" (UID: \"e9eb741d-265d-4f59-ab6e-c6a42f720801\") " pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.730986 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbst\" (UniqueName: \"kubernetes.io/projected/3dba06c0-4986-438c-a553-76b0bcddd74c-kube-api-access-wpbst\") pod \"test-operator-controller-manager-85777745bb-4bhqs\" (UID: \"3dba06c0-4986-438c-a553-76b0bcddd74c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.731122 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.731451 4658 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.731508 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert podName:f527a8e5-d051-4017-80e4-e3b2f1fd59ba nodeName:}" failed. No retries permitted until 2025-10-02 11:33:50.731490077 +0000 UTC m=+911.622643644 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert") pod "infra-operator-controller-manager-9d6c5db85-kznvq" (UID: "f527a8e5-d051-4017-80e4-e3b2f1fd59ba") : secret "infra-operator-webhook-server-cert" not found
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.733216 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-249z9"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.734024 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.758877 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"]
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.777894 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"]
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.781545 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.787063 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbst\" (UniqueName: \"kubernetes.io/projected/3dba06c0-4986-438c-a553-76b0bcddd74c-kube-api-access-wpbst\") pod \"test-operator-controller-manager-85777745bb-4bhqs\" (UID: \"3dba06c0-4986-438c-a553-76b0bcddd74c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.789687 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sw5rp"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.813523 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"]
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.831231 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.831963 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.832026 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2l5q\" (UniqueName: \"kubernetes.io/projected/903dfdb7-34f3-4875-8009-482cb7d5469b-kube-api-access-w2l5q\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.832051 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npd6m\" (UniqueName: \"kubernetes.io/projected/e9eb741d-265d-4f59-ab6e-c6a42f720801-kube-api-access-npd6m\") pod \"watcher-operator-controller-manager-7fc7d86889-mqpv9\" (UID: \"e9eb741d-265d-4f59-ab6e-c6a42f720801\") " pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.857980 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npd6m\" (UniqueName: \"kubernetes.io/projected/e9eb741d-265d-4f59-ab6e-c6a42f720801-kube-api-access-npd6m\") pod \"watcher-operator-controller-manager-7fc7d86889-mqpv9\" (UID: \"e9eb741d-265d-4f59-ab6e-c6a42f720801\") " pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.890080 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.918528 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.919546 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.933655 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5st\" (UniqueName: \"kubernetes.io/projected/75df76ba-0998-4b89-887e-d8f0b1c546b4-kube-api-access-7f5st\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xw82t\" (UID: \"75df76ba-0998-4b89-887e-d8f0b1c546b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.933708 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.933749 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2l5q\" (UniqueName: \"kubernetes.io/projected/903dfdb7-34f3-4875-8009-482cb7d5469b-kube-api-access-w2l5q\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"
Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.934164 4658 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 02 11:33:49 crc kubenswrapper[4658]: E1002 11:33:49.934207 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert podName:903dfdb7-34f3-4875-8009-482cb7d5469b nodeName:}" failed. No retries permitted until 2025-10-02 11:33:50.434191022 +0000 UTC m=+911.325344589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert") pod "openstack-operator-controller-manager-f6b64f7bf-8c66j" (UID: "903dfdb7-34f3-4875-8009-482cb7d5469b") : secret "webhook-server-cert" not found
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.957701 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.973586 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.977153 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.982963 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2l5q\" (UniqueName: \"kubernetes.io/projected/903dfdb7-34f3-4875-8009-482cb7d5469b-kube-api-access-w2l5q\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"
Oct 02 11:33:49 crc kubenswrapper[4658]: I1002 11:33:49.990533 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"
Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.034731 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5st\" (UniqueName: \"kubernetes.io/projected/75df76ba-0998-4b89-887e-d8f0b1c546b4-kube-api-access-7f5st\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xw82t\" (UID: \"75df76ba-0998-4b89-887e-d8f0b1c546b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"
Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.049269 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn"]
Oct 02 11:33:50 crc kubenswrapper[4658]: W1002 11:33:50.055145 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f426838_95ca_4579_9745_e78f0ccab683.slice/crio-d92d38883c5f5b5d3f747edc9da13aaf3bcf284f1e1547f6fb8abb8a5fc6aebd WatchSource:0}: Error finding container d92d38883c5f5b5d3f747edc9da13aaf3bcf284f1e1547f6fb8abb8a5fc6aebd: Status 404 returned error can't find the container with id d92d38883c5f5b5d3f747edc9da13aaf3bcf284f1e1547f6fb8abb8a5fc6aebd
Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.057435 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5st\" (UniqueName: \"kubernetes.io/projected/75df76ba-0998-4b89-887e-d8f0b1c546b4-kube-api-access-7f5st\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xw82t\" (UID: \"75df76ba-0998-4b89-887e-d8f0b1c546b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.097974 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.135582 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.152649 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbaa143-b11e-406d-b797-6ba114fbf9a4-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ffhdh\" (UID: \"afbaa143-b11e-406d-b797-6ba114fbf9a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.308822 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" event={"ID":"7744dcc1-5c52-4447-8123-53e4c98250fd","Type":"ContainerStarted","Data":"7d7bfcab5ad41c79867af3efac1e2b236eb993e97478f0d4c033cdf7da501412"} Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.310661 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" event={"ID":"3f426838-95ca-4579-9745-e78f0ccab683","Type":"ContainerStarted","Data":"d92d38883c5f5b5d3f747edc9da13aaf3bcf284f1e1547f6fb8abb8a5fc6aebd"} Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.390612 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.397772 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.403592 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.405932 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.444017 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:50 crc kubenswrapper[4658]: E1002 11:33:50.444547 4658 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 11:33:50 crc kubenswrapper[4658]: E1002 11:33:50.444644 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert podName:903dfdb7-34f3-4875-8009-482cb7d5469b nodeName:}" failed. No retries permitted until 2025-10-02 11:33:51.444623569 +0000 UTC m=+912.335777136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert") pod "openstack-operator-controller-manager-f6b64f7bf-8c66j" (UID: "903dfdb7-34f3-4875-8009-482cb7d5469b") : secret "webhook-server-cert" not found Oct 02 11:33:50 crc kubenswrapper[4658]: W1002 11:33:50.521046 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9ac0a3_4903_4115_9793_b6bd913d4e0a.slice/crio-0fc88226bdc4e90dff70384926649a3d1d395223567f4fce2a9130a8be49dd60 WatchSource:0}: Error finding container 0fc88226bdc4e90dff70384926649a3d1d395223567f4fce2a9130a8be49dd60: Status 404 returned error can't find the container with id 0fc88226bdc4e90dff70384926649a3d1d395223567f4fce2a9130a8be49dd60 Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.750644 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.754747 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:50 crc kubenswrapper[4658]: W1002 11:33:50.759822 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70026a4a_6db4_4777_afed_a5ea3de1fc60.slice/crio-f6e674714fca4d46c86eea9de076210e68a52bed0336485f8fa02c5b698ee8ce WatchSource:0}: Error finding container f6e674714fca4d46c86eea9de076210e68a52bed0336485f8fa02c5b698ee8ce: Status 404 returned error can't find the container with id f6e674714fca4d46c86eea9de076210e68a52bed0336485f8fa02c5b698ee8ce Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.760641 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f527a8e5-d051-4017-80e4-e3b2f1fd59ba-cert\") pod \"infra-operator-controller-manager-9d6c5db85-kznvq\" (UID: \"f527a8e5-d051-4017-80e4-e3b2f1fd59ba\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.770974 4658 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.810126 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.880059 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.885113 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.914859 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g"] Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.924177 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv"] Oct 02 11:33:50 crc kubenswrapper[4658]: W1002 11:33:50.930337 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2e2130_4b00_4242_8254_c8be160bfe89.slice/crio-29188b6718c271ae097af677e1f32c7f26a430d885850aeb54cf5992d00b682d WatchSource:0}: Error finding container 29188b6718c271ae097af677e1f32c7f26a430d885850aeb54cf5992d00b682d: Status 404 returned error can't find the container with id 29188b6718c271ae097af677e1f32c7f26a430d885850aeb54cf5992d00b682d Oct 02 11:33:50 crc kubenswrapper[4658]: I1002 11:33:50.944627 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.249465 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.266522 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.292942 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh"] Oct 02 11:33:51 crc kubenswrapper[4658]: W1002 11:33:51.321103 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a460926_8982_40c1_b177_3620aa3dcb79.slice/crio-eb8f560ce6f9f609ada5d67613a850c38fb3436578f5b84cae5963ff7c073782 WatchSource:0}: Error finding container eb8f560ce6f9f609ada5d67613a850c38fb3436578f5b84cae5963ff7c073782: Status 404 returned error can't find the container with id eb8f560ce6f9f609ada5d67613a850c38fb3436578f5b84cae5963ff7c073782 Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.328185 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9"] Oct 02 11:33:51 crc kubenswrapper[4658]: W1002 11:33:51.334409 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbaa143_b11e_406d_b797_6ba114fbf9a4.slice/crio-bcfc3f200e90f5627110a0e6242ea1ccb42ece5fe5ec99aa7165769ffcbd6976 WatchSource:0}: Error finding container 
Oct 02 11:33:51 crc kubenswrapper[4658]: W1002 11:33:51.334409 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbaa143_b11e_406d_b797_6ba114fbf9a4.slice/crio-bcfc3f200e90f5627110a0e6242ea1ccb42ece5fe5ec99aa7165769ffcbd6976 WatchSource:0}: Error finding container bcfc3f200e90f5627110a0e6242ea1ccb42ece5fe5ec99aa7165769ffcbd6976: Status 404 returned error can't find the container with id bcfc3f200e90f5627110a0e6242ea1ccb42ece5fe5ec99aa7165769ffcbd6976
Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.354062 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ndjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7b787867f4-kbj6t_openstack-operators(5aeb03f1-db88-497b-b3cb-11e01e2a7b31): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.355154 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t"]
Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.355190 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bn78j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5869cb545-ffhdh_openstack-operators(afbaa143-b11e-406d-b797-6ba114fbf9a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.355414 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" event={"ID":"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2","Type":"ContainerStarted","Data":"8f7cc8852ddcecb81865978f092cce4dad1c60147e4379fc4353d0c43b938211"}
Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.356472 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6x7f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-kznvq_openstack-operators(f527a8e5-d051-4017-80e4-e3b2f1fd59ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.356674 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7f5st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-xw82t_openstack-operators(75df76ba-0998-4b89-887e-d8f0b1c546b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
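[Note] "pull QPS exceeded" in the container-start failures above is the kubelet's own client-side registry throttle, not a registry error: image pulls pass through a token bucket governed by the KubeletConfiguration fields registryPullQPS (default 5) and registryBurst (default 10), so when roughly twenty operator images are requested at once, everything past the burst is rejected immediately and the pod falls into image back-off. A back-of-the-envelope sketch of that arithmetic (defaults assumed; the real limiter is a try-accept token bucket that never waits):

    # Sketch: why ~20 simultaneous pulls trip a qps=5/burst=10 limiter.
    qps, burst = 5.0, 10
    tokens = float(burst)          # bucket starts full
    rejected = []
    for i in range(1, 21):         # 20 pull requests in the same instant
        if tokens >= 1.0:
            tokens -= 1.0          # consume a token, pull proceeds
        else:
            rejected.append(i)     # no token -> "pull QPS exceeded"
    print("rejected pulls:", rejected)   # pulls 11-20 fail immediately

Raising registryPullQPS/registryBurst on the node, or simply letting the back-off retries spread the pulls out over time, both clear this condition.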
\"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" podUID="75df76ba-0998-4b89-887e-d8f0b1c546b4" Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.361100 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" event={"ID":"7b2e2130-4b00-4242-8254-c8be160bfe89","Type":"ContainerStarted","Data":"29188b6718c271ae097af677e1f32c7f26a430d885850aeb54cf5992d00b682d"} Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.363118 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hl79j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-88c7-g8dwz_openstack-operators(9787421c-8d35-4d30-8946-90bc71eba9c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.364436 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" event={"ID":"55b04e2c-c701-4f74-9fb6-1dce9d2de108","Type":"ContainerStarted","Data":"116813d4ef5341d66cc914fa8979f9380151ca9658f5f4f7cd407fcec5c6dcc3"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.365423 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.368562 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" event={"ID":"70026a4a-6db4-4777-afed-a5ea3de1fc60","Type":"ContainerStarted","Data":"f6e674714fca4d46c86eea9de076210e68a52bed0336485f8fa02c5b698ee8ce"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.372344 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.376340 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" event={"ID":"bf9ac0a3-4903-4115-9793-b6bd913d4e0a","Type":"ContainerStarted","Data":"0fc88226bdc4e90dff70384926649a3d1d395223567f4fce2a9130a8be49dd60"} Oct 02 11:33:51 crc kubenswrapper[4658]: E1002 11:33:51.376400 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zh5xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-84d6b4b759-ppg68_openstack-operators(c92dcd56-734e-430c-813e-1405ab2e141b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.377142 4658 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-4bhqs"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.378264 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" event={"ID":"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b","Type":"ContainerStarted","Data":"cf140700a1b38932ea965d5c429fab26fa0bd1c39b55fb99cdcc75ac23960a02"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.382096 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.382782 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" event={"ID":"d480d1a6-c309-454f-8e99-a762feed8490","Type":"ContainerStarted","Data":"712ee837fdcaee6a1a6ea7983128c24c830ff32800a2862bfe91d82c6f54e4a4"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.385225 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" event={"ID":"e9eb741d-265d-4f59-ab6e-c6a42f720801","Type":"ContainerStarted","Data":"3a8115b6cc3237ac2d235262c4889c4268a03891e741eaf3bb698da19a12275f"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.387638 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" event={"ID":"830f6e33-ad1f-4033-a725-9f10415996e7","Type":"ContainerStarted","Data":"f1f9b88d294ba3b1943fcd1273b626e0330253310d24fbc3f015bb83bc92db43"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.387902 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68"] Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.389541 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" event={"ID":"33b8c756-1330-4114-bf78-2b3835667a1e","Type":"ContainerStarted","Data":"fbfaacc6ac145d9e23d89dfe73e884309f480ff45345f7b2a1e894973aa98c4e"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.390712 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" event={"ID":"af944184-d59a-467d-983e-c66fb79823c6","Type":"ContainerStarted","Data":"da397e8f8fd54e717d319823a15bd095f7498fcd0e7cf0c6d9bb2740ac703a2d"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.391804 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" event={"ID":"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c","Type":"ContainerStarted","Data":"96f49aefb719c151395e6ce8acd5e846dc46776036315ef991e3b2bf362d9d3a"} Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.475374 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.487170 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/903dfdb7-34f3-4875-8009-482cb7d5469b-cert\") pod \"openstack-operator-controller-manager-f6b64f7bf-8c66j\" (UID: \"903dfdb7-34f3-4875-8009-482cb7d5469b\") " pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:51 crc kubenswrapper[4658]: I1002 11:33:51.575912 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.020044 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j"] Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.190472 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" podUID="5aeb03f1-db88-497b-b3cb-11e01e2a7b31" Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.201166 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" podUID="afbaa143-b11e-406d-b797-6ba114fbf9a4" Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.202505 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" podUID="c92dcd56-734e-430c-813e-1405ab2e141b" Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.305780 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" podUID="9787421c-8d35-4d30-8946-90bc71eba9c0" Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.314551 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" podUID="f527a8e5-d051-4017-80e4-e3b2f1fd59ba" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.400004 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" event={"ID":"afbaa143-b11e-406d-b797-6ba114fbf9a4","Type":"ContainerStarted","Data":"981e5fb897eea53720583f67e3eb527c3c8093941fb1a0e369ac4e8bcf1f07bb"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.400060 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" event={"ID":"afbaa143-b11e-406d-b797-6ba114fbf9a4","Type":"ContainerStarted","Data":"bcfc3f200e90f5627110a0e6242ea1ccb42ece5fe5ec99aa7165769ffcbd6976"} Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.401402 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" 
podUID="afbaa143-b11e-406d-b797-6ba114fbf9a4" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.402228 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" event={"ID":"3dba06c0-4986-438c-a553-76b0bcddd74c","Type":"ContainerStarted","Data":"0ab14958361dd9f9563334a0f031104e89b8a8efa32cc16549375f8e3ba5b381"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.403525 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" event={"ID":"6a460926-8982-40c1-b177-3620aa3dcb79","Type":"ContainerStarted","Data":"eb8f560ce6f9f609ada5d67613a850c38fb3436578f5b84cae5963ff7c073782"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.409741 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" event={"ID":"75df76ba-0998-4b89-887e-d8f0b1c546b4","Type":"ContainerStarted","Data":"52b43552b17ff9ae4b143c30a30315cec249d7272b843e25fc17bc8bd5e27bd1"} Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.411022 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" podUID="75df76ba-0998-4b89-887e-d8f0b1c546b4" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.412046 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" event={"ID":"c92dcd56-734e-430c-813e-1405ab2e141b","Type":"ContainerStarted","Data":"08f6631643bfc470d33a3dad275c91668ff79a5bfb4d5e9b5885da6525b7e0dc"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.412109 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" event={"ID":"c92dcd56-734e-430c-813e-1405ab2e141b","Type":"ContainerStarted","Data":"e2b9f346fd6e7b8694de1ce19e4a1c61dbd8bf7e0abec5a52015070c690f716b"} Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.412986 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" podUID="c92dcd56-734e-430c-813e-1405ab2e141b" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.413743 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" event={"ID":"9787421c-8d35-4d30-8946-90bc71eba9c0","Type":"ContainerStarted","Data":"65e7e5c72e6e7fd2c73419cb1a3d587d8320d3ce327e63725baa49c7ff7acc0b"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.413793 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" event={"ID":"9787421c-8d35-4d30-8946-90bc71eba9c0","Type":"ContainerStarted","Data":"76a094e7b52e4ac778d0bfe3f2db2af52f24382a01ccfa94fdab37bee4ea634d"} Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.416141 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" podUID="9787421c-8d35-4d30-8946-90bc71eba9c0" Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.417125 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" event={"ID":"903dfdb7-34f3-4875-8009-482cb7d5469b","Type":"ContainerStarted","Data":"37456756104ff4e48723a7e704dff91d795e91de37ebcb364ed072a997728d8d"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.417176 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" event={"ID":"903dfdb7-34f3-4875-8009-482cb7d5469b","Type":"ContainerStarted","Data":"34455dc4229c18a7694eac11a238779f25a38a119ba2d2bbf43af4a0c05457b9"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.418994 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" event={"ID":"5aeb03f1-db88-497b-b3cb-11e01e2a7b31","Type":"ContainerStarted","Data":"0bfe873173a9b22418c4055eab6658cd280cdcb0ad49e2433ab3d8690d9a4728"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.419040 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" event={"ID":"5aeb03f1-db88-497b-b3cb-11e01e2a7b31","Type":"ContainerStarted","Data":"1c8dfc14e84f3cc24694582dcf9b07fee17b7ff5dea44feae0f3f57612ee58ef"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.425621 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" event={"ID":"f527a8e5-d051-4017-80e4-e3b2f1fd59ba","Type":"ContainerStarted","Data":"2fbd2be9c99b7983586be3832591d81b41ee94fc67607fdd84ccfb90ddec8133"} Oct 02 11:33:52 crc kubenswrapper[4658]: I1002 11:33:52.425680 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" event={"ID":"f527a8e5-d051-4017-80e4-e3b2f1fd59ba","Type":"ContainerStarted","Data":"9b948176864f80a5879eea45398960f33c7af7c51390c0ee882dc375c37531de"} Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.428060 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" podUID="f527a8e5-d051-4017-80e4-e3b2f1fd59ba" Oct 02 11:33:52 crc kubenswrapper[4658]: E1002 11:33:52.433508 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" podUID="5aeb03f1-db88-497b-b3cb-11e01e2a7b31" Oct 02 11:33:53 crc kubenswrapper[4658]: I1002 11:33:53.444065 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" event={"ID":"903dfdb7-34f3-4875-8009-482cb7d5469b","Type":"ContainerStarted","Data":"efdd86fc55b3c68851114cc7d9bb382d0daa936713fd9af8e4f89572d86c31d3"} Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.445461 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" podUID="75df76ba-0998-4b89-887e-d8f0b1c546b4" Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.445897 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" podUID="c92dcd56-734e-430c-813e-1405ab2e141b" Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.446027 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" podUID="9787421c-8d35-4d30-8946-90bc71eba9c0" Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.446041 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" podUID="f527a8e5-d051-4017-80e4-e3b2f1fd59ba" Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.446231 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" podUID="afbaa143-b11e-406d-b797-6ba114fbf9a4" Oct 02 11:33:53 crc kubenswrapper[4658]: E1002 11:33:53.446312 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" podUID="5aeb03f1-db88-497b-b3cb-11e01e2a7b31" Oct 02 11:33:53 crc kubenswrapper[4658]: I1002 11:33:53.563046 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" podStartSLOduration=4.563006337 podStartE2EDuration="4.563006337s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 11:33:53.562268573 +0000 UTC m=+914.453422140" watchObservedRunningTime="2025-10-02 11:33:53.563006337 +0000 UTC m=+914.454159894" Oct 02 11:33:54 crc kubenswrapper[4658]: I1002 11:33:54.452814 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:33:54 crc kubenswrapper[4658]: I1002 11:33:54.662968 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:54 crc kubenswrapper[4658]: I1002 11:33:54.663047 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:54 crc kubenswrapper[4658]: I1002 11:33:54.721555 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:55 crc kubenswrapper[4658]: I1002 11:33:55.517685 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:33:55 crc kubenswrapper[4658]: I1002 11:33:55.557031 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:33:57 crc kubenswrapper[4658]: I1002 11:33:57.430346 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:33:57 crc kubenswrapper[4658]: I1002 11:33:57.430417 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:33:57 crc kubenswrapper[4658]: I1002 11:33:57.473739 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjtkq" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="registry-server" containerID="cri-o://841c49b85a53a00908e955f60a8cbde6366352ef8e87dea4124bee9242152270" gracePeriod=2 Oct 02 11:33:58 crc kubenswrapper[4658]: I1002 11:33:58.485576 4658 generic.go:334] "Generic (PLEG): container finished" podID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerID="841c49b85a53a00908e955f60a8cbde6366352ef8e87dea4124bee9242152270" exitCode=0 Oct 02 11:33:58 crc kubenswrapper[4658]: I1002 11:33:58.485686 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerDied","Data":"841c49b85a53a00908e955f60a8cbde6366352ef8e87dea4124bee9242152270"} Oct 02 11:34:01 crc kubenswrapper[4658]: I1002 11:34:01.582509 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f6b64f7bf-8c66j" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.025533 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.187563 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities\") pod \"b2562802-cdc8-4a40-89bd-9806d6150aca\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.187734 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2mmw\" (UniqueName: \"kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw\") pod \"b2562802-cdc8-4a40-89bd-9806d6150aca\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.187788 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content\") pod \"b2562802-cdc8-4a40-89bd-9806d6150aca\" (UID: \"b2562802-cdc8-4a40-89bd-9806d6150aca\") " Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.188892 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities" (OuterVolumeSpecName: "utilities") pod "b2562802-cdc8-4a40-89bd-9806d6150aca" (UID: "b2562802-cdc8-4a40-89bd-9806d6150aca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.200686 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2562802-cdc8-4a40-89bd-9806d6150aca" (UID: "b2562802-cdc8-4a40-89bd-9806d6150aca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.207032 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw" (OuterVolumeSpecName: "kube-api-access-b2mmw") pod "b2562802-cdc8-4a40-89bd-9806d6150aca" (UID: "b2562802-cdc8-4a40-89bd-9806d6150aca"). InnerVolumeSpecName "kube-api-access-b2mmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.289340 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.289386 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2mmw\" (UniqueName: \"kubernetes.io/projected/b2562802-cdc8-4a40-89bd-9806d6150aca-kube-api-access-b2mmw\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.289403 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2562802-cdc8-4a40-89bd-9806d6150aca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.534499 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjtkq" event={"ID":"b2562802-cdc8-4a40-89bd-9806d6150aca","Type":"ContainerDied","Data":"38e219544507dd5e4bf7f0713f7ec7dd1bb8a6bab5dcb509a639a4964b2448ee"} Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.534558 4658 scope.go:117] "RemoveContainer" containerID="841c49b85a53a00908e955f60a8cbde6366352ef8e87dea4124bee9242152270" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.534776 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjtkq" Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.572449 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:34:04 crc kubenswrapper[4658]: I1002 11:34:04.577534 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjtkq"] Oct 02 11:34:05 crc kubenswrapper[4658]: I1002 11:34:05.130985 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:34:05 crc kubenswrapper[4658]: I1002 11:34:05.352726 4658 scope.go:117] "RemoveContainer" containerID="cb1ad6f6553ca53de5e016f1a946cff2f56218fbf9bcd61c276805de3da90b72" Oct 02 11:34:05 crc kubenswrapper[4658]: I1002 11:34:05.378515 4658 scope.go:117] "RemoveContainer" containerID="035e2c2e2289ca971ac92fddf5940f5f53a2507647e1b6bd41af661edfb56f6e" Oct 02 11:34:05 crc kubenswrapper[4658]: I1002 11:34:05.974594 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" path="/var/lib/kubelet/pods/b2562802-cdc8-4a40-89bd-9806d6150aca/volumes" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.572006 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" event={"ID":"3f426838-95ca-4579-9745-e78f0ccab683","Type":"ContainerStarted","Data":"6d2c3a853973eb57d6825d96c54c4d79b14f0be88b04d8a69f2f8aee4ed5ab97"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.572078 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" event={"ID":"3f426838-95ca-4579-9745-e78f0ccab683","Type":"ContainerStarted","Data":"a26cabfbfdcd60e2a903fed8d4accb369884fbfc0bf430cb75fea1cb8c4ebd63"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.572632 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.579505 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" event={"ID":"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2","Type":"ContainerStarted","Data":"b7c7e01b23f9f9d4ca8e0371b0b39e52461ee6046e6a961ee3ea36e589eaf6bc"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.600220 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" event={"ID":"55b04e2c-c701-4f74-9fb6-1dce9d2de108","Type":"ContainerStarted","Data":"95a7cedc7fce409efdb842962b85c963bcfe551092fa10471bec259cb35a4c94"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.600277 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" event={"ID":"55b04e2c-c701-4f74-9fb6-1dce9d2de108","Type":"ContainerStarted","Data":"2f92483b0fa42a778d6bad85f716ae42a760545731c1e582a72b4297d4c14361"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.601351 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.607681 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" podStartSLOduration=8.925507353 podStartE2EDuration="18.607659761s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.134548841 +0000 UTC m=+911.025702408" lastFinishedPulling="2025-10-02 11:33:59.816701249 +0000 UTC m=+920.707854816" observedRunningTime="2025-10-02 11:34:06.591626822 +0000 UTC m=+927.482780399" watchObservedRunningTime="2025-10-02 11:34:06.607659761 +0000 UTC m=+927.498813328" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.620949 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" event={"ID":"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b","Type":"ContainerStarted","Data":"8d2358b51eba9a0d01ddad6a2d81801fee5adaa3e5d62040c334602bdd7fee0c"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.626999 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" podStartSLOduration=5.630207601 podStartE2EDuration="18.626974185s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.929867115 +0000 UTC m=+911.821020682" lastFinishedPulling="2025-10-02 11:34:03.926633699 +0000 UTC m=+924.817787266" observedRunningTime="2025-10-02 11:34:06.622063309 +0000 UTC m=+927.513216876" watchObservedRunningTime="2025-10-02 11:34:06.626974185 +0000 UTC m=+927.518127752" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.639418 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" event={"ID":"e9eb741d-265d-4f59-ab6e-c6a42f720801","Type":"ContainerStarted","Data":"441e2de2f0f2ad16534b445283c7d4a84db2640a6461aa8341d569a279e4854e"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.642043 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" 
event={"ID":"3dba06c0-4986-438c-a553-76b0bcddd74c","Type":"ContainerStarted","Data":"69c74a2fbd1656f73d3cdfdfa6a77d036acc96718c2be10884b1d8e5064ddadd"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.643403 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" event={"ID":"7744dcc1-5c52-4447-8123-53e4c98250fd","Type":"ContainerStarted","Data":"33aea2c0b9ff833b64c14aff3383ffa77a72cf9be9c0bc94391f0b3c4f4bdc4b"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.644755 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" event={"ID":"830f6e33-ad1f-4033-a725-9f10415996e7","Type":"ContainerStarted","Data":"bacda6be6ff7805330ec2e2d78d5aea417a8dc789b96fc5bc11a3798c0004059"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.653444 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" event={"ID":"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c","Type":"ContainerStarted","Data":"c5419294dac5c527da66fd872018e36ddf4cb9586ee4ece1265e9783c33f8f02"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.669499 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" event={"ID":"7b2e2130-4b00-4242-8254-c8be160bfe89","Type":"ContainerStarted","Data":"d7c9a7a4440d0d89f366735a3595e74aae81799c4cc53de7663dee15ab48d33e"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.699191 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" event={"ID":"70026a4a-6db4-4777-afed-a5ea3de1fc60","Type":"ContainerStarted","Data":"cee1830102385130b51b95088adacb7cdec370996c14620701ec890dd3ab4e4b"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.699242 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" event={"ID":"70026a4a-6db4-4777-afed-a5ea3de1fc60","Type":"ContainerStarted","Data":"e817e2a4300911a68377d6b18ba764a2798351a8e775914903d81423b8fe86d8"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.700320 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.707181 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" event={"ID":"d480d1a6-c309-454f-8e99-a762feed8490","Type":"ContainerStarted","Data":"846ee8026391e323413829d7e15e72ec2ea3a6139831effa2c223d05d8f77087"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.759783 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" event={"ID":"6a460926-8982-40c1-b177-3620aa3dcb79","Type":"ContainerStarted","Data":"600f4c5195256578c271e55b05af7a511bd15e0c1a490df7d1f1bbd3a6b34a2b"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.795689 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" event={"ID":"bf9ac0a3-4903-4115-9793-b6bd913d4e0a","Type":"ContainerStarted","Data":"86a837fcac0629f6781379c6c485e668b6b6ebed3305524cd739fbc89d8e2873"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.833923 4658 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" event={"ID":"33b8c756-1330-4114-bf78-2b3835667a1e","Type":"ContainerStarted","Data":"e3c4e8c8257af9db98990c898d32cc19af91de9638d645c13850ac0f1f01b628"} Oct 02 11:34:06 crc kubenswrapper[4658]: I1002 11:34:06.848442 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" event={"ID":"af944184-d59a-467d-983e-c66fb79823c6","Type":"ContainerStarted","Data":"0e22825f28759bcb1c281aa8217fd9bad3b856f34a45d4624ade4b313d7f3d9a"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.877619 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" event={"ID":"33b8c756-1330-4114-bf78-2b3835667a1e","Type":"ContainerStarted","Data":"6bf542dece98cbd73f76a37b7b90bed8b442981eaba93d8372c6fe06cf94e426"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.877747 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.880932 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" event={"ID":"d480d1a6-c309-454f-8e99-a762feed8490","Type":"ContainerStarted","Data":"05b6432a3520a5bfb538e11e310b68d0dd8eeba1e27e63168f40ef60b76b426e"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.880983 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.883279 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" event={"ID":"6a460926-8982-40c1-b177-3620aa3dcb79","Type":"ContainerStarted","Data":"34975f5787642e5fb7bbc8981eb1ecb87a557d4a25593b84a3f89c997c8be8f0"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.884019 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.894068 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" event={"ID":"af944184-d59a-467d-983e-c66fb79823c6","Type":"ContainerStarted","Data":"3998ec07424e467e274bd5453a7b61bd28c8abbc6896ab9f22e8d49a240ce235"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.894106 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.898267 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" podStartSLOduration=5.30517928 podStartE2EDuration="19.898244651s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.762796084 +0000 UTC m=+911.653949651" lastFinishedPulling="2025-10-02 11:34:05.355861455 +0000 UTC m=+926.247015022" observedRunningTime="2025-10-02 11:34:06.777529591 +0000 UTC m=+927.668683158" watchObservedRunningTime="2025-10-02 11:34:07.898244651 +0000 UTC m=+928.789398228" Oct 02 11:34:07 crc 
kubenswrapper[4658]: I1002 11:34:07.901125 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" event={"ID":"830f6e33-ad1f-4033-a725-9f10415996e7","Type":"ContainerStarted","Data":"a6510b2600833e1a2aa04e90c3398d0c4f1698933cff384bbd7c001804ce19b5"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.902029 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.911533 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" podStartSLOduration=4.847692786 podStartE2EDuration="18.911515282s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.292030709 +0000 UTC m=+912.183184276" lastFinishedPulling="2025-10-02 11:34:05.355853195 +0000 UTC m=+926.247006772" observedRunningTime="2025-10-02 11:34:07.894411048 +0000 UTC m=+928.785564615" watchObservedRunningTime="2025-10-02 11:34:07.911515282 +0000 UTC m=+928.802668849" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.912707 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" event={"ID":"7b2e2130-4b00-4242-8254-c8be160bfe89","Type":"ContainerStarted","Data":"7c18c86f044bb50992ed7c00f106cefecc253ef09c5aebaa2b7fb321ae812db5"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.914474 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.917276 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" event={"ID":"bf9ac0a3-4903-4115-9793-b6bd913d4e0a","Type":"ContainerStarted","Data":"2803dfa155a5ccb5bc045622cd5251bc59be816430869849596ecc416a9d5a0a"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.917996 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.921083 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" podStartSLOduration=5.027713229 podStartE2EDuration="19.921064096s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.486064127 +0000 UTC m=+911.377217694" lastFinishedPulling="2025-10-02 11:34:05.379414994 +0000 UTC m=+926.270568561" observedRunningTime="2025-10-02 11:34:07.910397167 +0000 UTC m=+928.801550744" watchObservedRunningTime="2025-10-02 11:34:07.921064096 +0000 UTC m=+928.812217663" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.921392 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" event={"ID":"d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b","Type":"ContainerStarted","Data":"619325d38b7cdeb59aae917680fd96dc2f9f9329c97c146beea4e58af34580fe"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.921859 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:34:07 crc 
kubenswrapper[4658]: I1002 11:34:07.928638 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" event={"ID":"7744dcc1-5c52-4447-8123-53e4c98250fd","Type":"ContainerStarted","Data":"32301ea57429321bc50a3367274f3cdb495f4deb2fc50fa2b9281790df566b4b"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.928799 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.929376 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" podStartSLOduration=4.910808384 podStartE2EDuration="18.92935998s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.337311679 +0000 UTC m=+912.228465246" lastFinishedPulling="2025-10-02 11:34:05.355863265 +0000 UTC m=+926.247016842" observedRunningTime="2025-10-02 11:34:07.9252889 +0000 UTC m=+928.816442477" watchObservedRunningTime="2025-10-02 11:34:07.92935998 +0000 UTC m=+928.820513547" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.932321 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" event={"ID":"e9eb741d-265d-4f59-ab6e-c6a42f720801","Type":"ContainerStarted","Data":"d701a8e67c87362f0d412b18810688882a981848cbe7f59dbc4a6afb3774bb04"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.932590 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.935424 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" event={"ID":"6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c","Type":"ContainerStarted","Data":"05b508bf567e42e4fe69272c282570c429eee8c7309eddd76cb9b630b307a8f0"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.936003 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.939250 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" event={"ID":"3dba06c0-4986-438c-a553-76b0bcddd74c","Type":"ContainerStarted","Data":"69c7d6a3ec53f083aa6bf1c46cbd5bc621cf83b7c64766e1d94ca41f616fb0de"} Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.939589 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.946482 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" podStartSLOduration=5.394084897 podStartE2EDuration="19.946435673s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.804213591 +0000 UTC m=+911.695367158" lastFinishedPulling="2025-10-02 11:34:05.356564347 +0000 UTC m=+926.247717934" observedRunningTime="2025-10-02 11:34:07.941274279 +0000 UTC m=+928.832427846" watchObservedRunningTime="2025-10-02 11:34:07.946435673 +0000 UTC m=+928.837589240" Oct 02 11:34:07 crc 
kubenswrapper[4658]: I1002 11:34:07.969108 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" podStartSLOduration=4.920931195 podStartE2EDuration="18.969086053s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.337773043 +0000 UTC m=+912.228926610" lastFinishedPulling="2025-10-02 11:34:05.385927901 +0000 UTC m=+926.277081468" observedRunningTime="2025-10-02 11:34:07.966088148 +0000 UTC m=+928.857241715" watchObservedRunningTime="2025-10-02 11:34:07.969086053 +0000 UTC m=+928.860239620" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.999227 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:34:07 crc kubenswrapper[4658]: I1002 11:34:07.999286 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" event={"ID":"c802dbff-c65f-40e9-91ee-3ea6f0aee6a2","Type":"ContainerStarted","Data":"53b75e932c2a0d3c7f496ba01674d35677e3d6f714e70a5b809ab61bfb9e36e7"} Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.001761 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" podStartSLOduration=7.544289834 podStartE2EDuration="20.001738621s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.486334575 +0000 UTC m=+911.377488142" lastFinishedPulling="2025-10-02 11:34:02.943783362 +0000 UTC m=+923.834936929" observedRunningTime="2025-10-02 11:34:07.98880381 +0000 UTC m=+928.879957387" watchObservedRunningTime="2025-10-02 11:34:08.001738621 +0000 UTC m=+928.892892208" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.019508 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" podStartSLOduration=5.603564175 podStartE2EDuration="20.019485455s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.94072617 +0000 UTC m=+911.831879737" lastFinishedPulling="2025-10-02 11:34:05.35664745 +0000 UTC m=+926.247801017" observedRunningTime="2025-10-02 11:34:08.013494915 +0000 UTC m=+928.904648482" watchObservedRunningTime="2025-10-02 11:34:08.019485455 +0000 UTC m=+928.910639022" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.052042 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" podStartSLOduration=5.042694296 podStartE2EDuration="19.052019449s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.353885216 +0000 UTC m=+912.245038783" lastFinishedPulling="2025-10-02 11:34:05.363210369 +0000 UTC m=+926.254363936" observedRunningTime="2025-10-02 11:34:08.029196394 +0000 UTC m=+928.920349971" watchObservedRunningTime="2025-10-02 11:34:08.052019449 +0000 UTC m=+928.943173016" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.081650 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" podStartSLOduration=4.638086864 podStartE2EDuration="19.081628191s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.940333048 +0000 UTC 
m=+911.831486615" lastFinishedPulling="2025-10-02 11:34:05.383874375 +0000 UTC m=+926.275027942" observedRunningTime="2025-10-02 11:34:08.07625317 +0000 UTC m=+928.967406737" watchObservedRunningTime="2025-10-02 11:34:08.081628191 +0000 UTC m=+928.972781788" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.084843 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" podStartSLOduration=5.684352652 podStartE2EDuration="20.084829512s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.962714389 +0000 UTC m=+911.853867956" lastFinishedPulling="2025-10-02 11:34:05.363191229 +0000 UTC m=+926.254344816" observedRunningTime="2025-10-02 11:34:08.05863187 +0000 UTC m=+928.949785447" watchObservedRunningTime="2025-10-02 11:34:08.084829512 +0000 UTC m=+928.975983089" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.095477 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" podStartSLOduration=8.797150224 podStartE2EDuration="20.095461001s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.202671187 +0000 UTC m=+911.093824764" lastFinishedPulling="2025-10-02 11:34:01.500981984 +0000 UTC m=+922.392135541" observedRunningTime="2025-10-02 11:34:08.091413962 +0000 UTC m=+928.982567549" watchObservedRunningTime="2025-10-02 11:34:08.095461001 +0000 UTC m=+928.986614568" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.120108 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" podStartSLOduration=6.269138644 podStartE2EDuration="20.120084533s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.570657865 +0000 UTC m=+911.461811432" lastFinishedPulling="2025-10-02 11:34:04.421603754 +0000 UTC m=+925.312757321" observedRunningTime="2025-10-02 11:34:08.113809754 +0000 UTC m=+929.004963331" watchObservedRunningTime="2025-10-02 11:34:08.120084533 +0000 UTC m=+929.011238120" Oct 02 11:34:08 crc kubenswrapper[4658]: I1002 11:34:08.133730 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" podStartSLOduration=4.720366278 podStartE2EDuration="19.133707526s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:50.949437947 +0000 UTC m=+911.840591514" lastFinishedPulling="2025-10-02 11:34:05.362779185 +0000 UTC m=+926.253932762" observedRunningTime="2025-10-02 11:34:08.129945176 +0000 UTC m=+929.021098753" watchObservedRunningTime="2025-10-02 11:34:08.133707526 +0000 UTC m=+929.024861093" Oct 02 11:34:10 crc kubenswrapper[4658]: I1002 11:34:10.978421 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" event={"ID":"75df76ba-0998-4b89-887e-d8f0b1c546b4","Type":"ContainerStarted","Data":"622bb4e134b81fea57c3b6e0e2ddcdb0137e738914d0876f7c10d2e8ae686cd0"} Oct 02 11:34:10 crc kubenswrapper[4658]: I1002 11:34:10.994704 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xw82t" podStartSLOduration=3.296697608 podStartE2EDuration="21.99468561s" podCreationTimestamp="2025-10-02 11:33:49 
+0000 UTC" firstStartedPulling="2025-10-02 11:33:51.35653877 +0000 UTC m=+912.247692337" lastFinishedPulling="2025-10-02 11:34:10.054526772 +0000 UTC m=+930.945680339" observedRunningTime="2025-10-02 11:34:10.993352458 +0000 UTC m=+931.884506025" watchObservedRunningTime="2025-10-02 11:34:10.99468561 +0000 UTC m=+931.885839177" Oct 02 11:34:11 crc kubenswrapper[4658]: I1002 11:34:11.988488 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" event={"ID":"9787421c-8d35-4d30-8946-90bc71eba9c0","Type":"ContainerStarted","Data":"3278b2d2b4b93c527de97269426fa1306df925ae249784ba86a49c73395b4ddb"} Oct 02 11:34:11 crc kubenswrapper[4658]: I1002 11:34:11.991324 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" event={"ID":"5aeb03f1-db88-497b-b3cb-11e01e2a7b31","Type":"ContainerStarted","Data":"8c2abf44ccb0845f8147aaf13ab7f28822b2300ddffa762a78da3db9e434486e"} Oct 02 11:34:11 crc kubenswrapper[4658]: I1002 11:34:11.993087 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" event={"ID":"f527a8e5-d051-4017-80e4-e3b2f1fd59ba","Type":"ContainerStarted","Data":"21582e51a875b1d7a1564b11fed41787b42b1de6ebaf47d74e5d6fd5d699fef5"} Oct 02 11:34:11 crc kubenswrapper[4658]: I1002 11:34:11.993317 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:34:12 crc kubenswrapper[4658]: I1002 11:34:12.011382 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" podStartSLOduration=5.309724774 podStartE2EDuration="24.011364872s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.356366234 +0000 UTC m=+912.247519801" lastFinishedPulling="2025-10-02 11:34:10.058006322 +0000 UTC m=+930.949159899" observedRunningTime="2025-10-02 11:34:12.01006907 +0000 UTC m=+932.901222637" watchObservedRunningTime="2025-10-02 11:34:12.011364872 +0000 UTC m=+932.902518429" Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.003714 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" event={"ID":"c92dcd56-734e-430c-813e-1405ab2e141b","Type":"ContainerStarted","Data":"25a854d2e95244f4eda538324864699ff6421b045c18d3a4e1960a1dced91384"} Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.004248 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.005647 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" event={"ID":"afbaa143-b11e-406d-b797-6ba114fbf9a4","Type":"ContainerStarted","Data":"b2c5cfeb320003bdee1fd82049a3008c6f06e28ec7716ff13858e707666510a3"} Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.028647 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" podStartSLOduration=3.353408982 podStartE2EDuration="24.028625362s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.376246467 +0000 UTC 
m=+912.267400034" lastFinishedPulling="2025-10-02 11:34:12.051462847 +0000 UTC m=+932.942616414" observedRunningTime="2025-10-02 11:34:13.020582555 +0000 UTC m=+933.911736142" watchObservedRunningTime="2025-10-02 11:34:13.028625362 +0000 UTC m=+933.919778929" Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.051701 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" podStartSLOduration=3.349804956 podStartE2EDuration="24.051675804s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.354633229 +0000 UTC m=+912.245786796" lastFinishedPulling="2025-10-02 11:34:12.056504077 +0000 UTC m=+932.947657644" observedRunningTime="2025-10-02 11:34:13.048800873 +0000 UTC m=+933.939954440" watchObservedRunningTime="2025-10-02 11:34:13.051675804 +0000 UTC m=+933.942829371" Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.067162 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" podStartSLOduration=6.373291315 podStartE2EDuration="25.067143536s" podCreationTimestamp="2025-10-02 11:33:48 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.362959393 +0000 UTC m=+912.254112960" lastFinishedPulling="2025-10-02 11:34:10.056811614 +0000 UTC m=+930.947965181" observedRunningTime="2025-10-02 11:34:13.065259326 +0000 UTC m=+933.956412903" watchObservedRunningTime="2025-10-02 11:34:13.067143536 +0000 UTC m=+933.958297103" Oct 02 11:34:13 crc kubenswrapper[4658]: I1002 11:34:13.084698 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" podStartSLOduration=5.381789786 podStartE2EDuration="24.084681184s" podCreationTimestamp="2025-10-02 11:33:49 +0000 UTC" firstStartedPulling="2025-10-02 11:33:51.353904456 +0000 UTC m=+912.245058023" lastFinishedPulling="2025-10-02 11:34:10.056795854 +0000 UTC m=+930.947949421" observedRunningTime="2025-10-02 11:34:13.082435902 +0000 UTC m=+933.973589469" watchObservedRunningTime="2025-10-02 11:34:13.084681184 +0000 UTC m=+933.975834751" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.104685 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-kkldn" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.132624 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fgm4w" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.159276 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gckv9" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.242576 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-66q5b" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.259214 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-ljz2h" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.410820 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7mfsk" Oct 02 11:34:19 crc 
kubenswrapper[4658]: I1002 11:34:19.465345 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8ttj2" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.484448 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-l62bl" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.529816 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-tnfxq" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.553521 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.558633 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-g8dwz" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.691553 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-mhrcv" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.834477 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-wqqdv" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.893791 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-htz9g" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.920188 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.921921 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-fsnf7" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.921987 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-kbj6t" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.970788 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-ppg68" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.977110 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-4bhqs" Oct 02 11:34:19 crc kubenswrapper[4658]: I1002 11:34:19.979848 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-49k5r" Oct 02 11:34:20 crc kubenswrapper[4658]: I1002 11:34:20.001090 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7fc7d86889-mqpv9" Oct 02 11:34:20 crc kubenswrapper[4658]: I1002 11:34:20.407005 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:34:20 crc kubenswrapper[4658]: I1002 11:34:20.413510 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ffhdh" Oct 02 11:34:20 crc kubenswrapper[4658]: I1002 11:34:20.818382 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-kznvq" Oct 02 11:34:27 crc kubenswrapper[4658]: I1002 11:34:27.429657 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:34:27 crc kubenswrapper[4658]: I1002 11:34:27.430265 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.753964 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:38 crc kubenswrapper[4658]: E1002 11:34:38.754741 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="extract-content" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.754754 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="extract-content" Oct 02 11:34:38 crc kubenswrapper[4658]: E1002 11:34:38.754780 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="registry-server" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.754786 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="registry-server" Oct 02 11:34:38 crc kubenswrapper[4658]: E1002 11:34:38.754808 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="extract-utilities" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.754814 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="extract-utilities" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.754974 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2562802-cdc8-4a40-89bd-9806d6150aca" containerName="registry-server" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.755737 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.760762 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.762076 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.762905 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.763021 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wdjk8" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.781407 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.812627 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.814223 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.816350 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.824056 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.845601 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfr5l\" (UniqueName: \"kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.845665 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.845715 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.845848 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.846061 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.947535 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.947598 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfr5l\" (UniqueName: \"kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.947625 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.947651 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.947694 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.948809 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.948825 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.948824 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.967937 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl\") pod \"dnsmasq-dns-78dd6ddcc-d94kv\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:38 crc kubenswrapper[4658]: I1002 11:34:38.967939 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qfr5l\" (UniqueName: \"kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l\") pod \"dnsmasq-dns-675f4bcbfc-fqwlp\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:39 crc kubenswrapper[4658]: I1002 11:34:39.079221 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:39 crc kubenswrapper[4658]: I1002 11:34:39.135125 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:39 crc kubenswrapper[4658]: I1002 11:34:39.611027 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:39 crc kubenswrapper[4658]: I1002 11:34:39.658174 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:39 crc kubenswrapper[4658]: W1002 11:34:39.663062 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ea97cd4_0286_4151_9815_15c2b5839d4e.slice/crio-cf56b7d1a4553f1d78d7dd596b77c69aa6fbc369b4fcf5ed60d190f67f44a1b4 WatchSource:0}: Error finding container cf56b7d1a4553f1d78d7dd596b77c69aa6fbc369b4fcf5ed60d190f67f44a1b4: Status 404 returned error can't find the container with id cf56b7d1a4553f1d78d7dd596b77c69aa6fbc369b4fcf5ed60d190f67f44a1b4 Oct 02 11:34:40 crc kubenswrapper[4658]: I1002 11:34:40.242238 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" event={"ID":"9ea97cd4-0286-4151-9815-15c2b5839d4e","Type":"ContainerStarted","Data":"cf56b7d1a4553f1d78d7dd596b77c69aa6fbc369b4fcf5ed60d190f67f44a1b4"} Oct 02 11:34:40 crc kubenswrapper[4658]: I1002 11:34:40.244073 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" event={"ID":"c3256ab8-abfa-4828-9398-db8fea8e51d7","Type":"ContainerStarted","Data":"d49f69666778f0fe23acb8aa4bd10509468f02967e4b7c3d7c79ba3f70788978"} Oct 02 11:34:41 crc kubenswrapper[4658]: I1002 11:34:41.902051 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:41 crc kubenswrapper[4658]: I1002 11:34:41.931569 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:34:41 crc kubenswrapper[4658]: I1002 11:34:41.933518 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:41 crc kubenswrapper[4658]: I1002 11:34:41.996818 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.100835 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.100956 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.101196 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7v8l\" (UniqueName: \"kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.202401 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.202452 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.202521 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7v8l\" (UniqueName: \"kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.203624 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.203655 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.230018 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.238154 
4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7v8l\" (UniqueName: \"kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l\") pod \"dnsmasq-dns-5ccc8479f9-l8bhf\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.275523 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.311747 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.312987 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.358358 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.404544 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.404810 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jdj\" (UniqueName: \"kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.404928 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.534248 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.534352 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.534453 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jdj\" (UniqueName: \"kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.547443 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.547590 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.578085 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4jdj\" (UniqueName: \"kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj\") pod \"dnsmasq-dns-57d769cc4f-hg2rp\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:42 crc kubenswrapper[4658]: I1002 11:34:42.669976 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.078378 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.090383 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.092668 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.095282 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tzrzj" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.095751 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.095894 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.096351 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.096567 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.096782 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.097067 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.125257 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:34:43 crc kubenswrapper[4658]: W1002 11:34:43.140514 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3713df1_dcc1_4baa_86dd_cd001a87df5e.slice/crio-d93b2d99b74ada1948ad12252c933b6ff19d5850c1c2327e8a9360dd0d063f94 WatchSource:0}: Error finding container d93b2d99b74ada1948ad12252c933b6ff19d5850c1c2327e8a9360dd0d063f94: Status 404 returned error can't find the container with id 
d93b2d99b74ada1948ad12252c933b6ff19d5850c1c2327e8a9360dd0d063f94 Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.217070 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.250941 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.250999 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251033 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251088 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251120 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdmj\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251153 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251172 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251197 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251241 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251265 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.251282 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.292587 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" event={"ID":"a3713df1-dcc1-4baa-86dd-cd001a87df5e","Type":"ContainerStarted","Data":"d93b2d99b74ada1948ad12252c933b6ff19d5850c1c2327e8a9360dd0d063f94"} Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.294055 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" event={"ID":"37b043f6-c671-43a2-9062-d2969c0253a9","Type":"ContainerStarted","Data":"fd8eeeb2d062a059544dafe45aa1aa0615477b29bc0b45f758074eceab48f38b"} Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352586 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352655 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352684 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352717 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352758 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352793 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352863 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352899 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdmj\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352930 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352951 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.352977 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.353158 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.353183 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.354118 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.354213 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.354789 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.355818 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.364779 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.364854 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.364934 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.369330 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.371714 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hdmj\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.379355 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.419609 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.509943 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.511212 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.515100 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.515960 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.516283 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.516453 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.516563 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.516649 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.516881 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v52cg" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.526576 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.656925 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.656978 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657000 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657019 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657110 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data\") pod \"rabbitmq-server-0\" 
(UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657161 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657224 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657274 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657322 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2flp\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657348 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.657402 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.758906 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.758948 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.758972 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759012 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759036 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759064 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2flp\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759086 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759117 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759143 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759165 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759184 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.759353 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.760611 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.761023 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.761315 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.764927 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.765185 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.767503 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.771186 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.771335 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.772572 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.789183 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.789795 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2flp\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp\") 
pod \"rabbitmq-server-0\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " pod="openstack/rabbitmq-server-0" Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.795369 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:34:43 crc kubenswrapper[4658]: I1002 11:34:43.845018 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.214320 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.218628 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.221501 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.221704 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8mgzf" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.222878 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.223903 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.226155 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.229905 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.231029 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303127 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303237 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/590179b8-356d-4392-bab5-037103481383-config-data-generated\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303270 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-config-data-default\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303288 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 
11:34:46.303327 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-kolla-config\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303404 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303441 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-secrets\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303483 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-operator-scripts\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.303505 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2th\" (UniqueName: \"kubernetes.io/projected/590179b8-356d-4392-bab5-037103481383-kube-api-access-pc2th\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.375101 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.376409 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.378811 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.379002 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kn8l5"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.379471 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.379476 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.394818 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405575 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405636 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/590179b8-356d-4392-bab5-037103481383-config-data-generated\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405669 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405706 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-config-data-default\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405732 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405762 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-kolla-config\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405781 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405834 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405864 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405898 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405929 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-secrets\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405959 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.405986 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406006 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdprh\" (UniqueName: \"kubernetes.io/projected/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kube-api-access-mdprh\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406035 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-operator-scripts\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406061 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2th\" (UniqueName: \"kubernetes.io/projected/590179b8-356d-4392-bab5-037103481383-kube-api-access-pc2th\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406094 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406116 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.406772 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/590179b8-356d-4392-bab5-037103481383-config-data-generated\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.409085 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-kolla-config\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.409196 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.411322 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-operator-scripts\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.415735 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/590179b8-356d-4392-bab5-037103481383-config-data-default\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.425923 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.426315 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.427454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-secrets\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0"
\"kubernetes.io/secret/590179b8-356d-4392-bab5-037103481383-secrets\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.443186 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2th\" (UniqueName: \"kubernetes.io/projected/590179b8-356d-4392-bab5-037103481383-kube-api-access-pc2th\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.452601 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"590179b8-356d-4392-bab5-037103481383\") " pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.507614 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.507978 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508121 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508240 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508364 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508492 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508603 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 
11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508695 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.508784 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdprh\" (UniqueName: \"kubernetes.io/projected/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kube-api-access-mdprh\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.507872 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.510690 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.511382 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.511454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.511503 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ecaec123-d0cf-493f-bee4-b32cd4f084bf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.514413 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.516883 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.517763 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecaec123-d0cf-493f-bee4-b32cd4f084bf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.537050 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdprh\" (UniqueName: \"kubernetes.io/projected/ecaec123-d0cf-493f-bee4-b32cd4f084bf-kube-api-access-mdprh\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.545740 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ecaec123-d0cf-493f-bee4-b32cd4f084bf\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.554996 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.587891 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.588831 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.591342 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.591350 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.591501 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-psjmf" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.600560 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.610643 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.610682 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjpf\" (UniqueName: \"kubernetes.io/projected/3f3cc404-a92f-4ef8-a799-83eb314e4382-kube-api-access-pxjpf\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.610720 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.610748 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-config-data\") pod 
\"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.610786 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-kolla-config\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.703412 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.712267 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.712343 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjpf\" (UniqueName: \"kubernetes.io/projected/3f3cc404-a92f-4ef8-a799-83eb314e4382-kube-api-access-pxjpf\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.712399 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.712437 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-config-data\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.713469 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-config-data\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.713486 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-kolla-config\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.712512 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f3cc404-a92f-4ef8-a799-83eb314e4382-kolla-config\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.715678 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.716260 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3cc404-a92f-4ef8-a799-83eb314e4382-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.739384 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjpf\" (UniqueName: \"kubernetes.io/projected/3f3cc404-a92f-4ef8-a799-83eb314e4382-kube-api-access-pxjpf\") pod \"memcached-0\" (UID: \"3f3cc404-a92f-4ef8-a799-83eb314e4382\") " pod="openstack/memcached-0" Oct 02 11:34:46 crc kubenswrapper[4658]: I1002 11:34:46.908329 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:34:47 crc kubenswrapper[4658]: W1002 11:34:47.834809 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc6649a_7a89_4658_9a2d_a09cb4f5f860.slice/crio-9189d902497667418e1e0de6261191f11ca4858e09eb937e77281963dba3794b WatchSource:0}: Error finding container 9189d902497667418e1e0de6261191f11ca4858e09eb937e77281963dba3794b: Status 404 returned error can't find the container with id 9189d902497667418e1e0de6261191f11ca4858e09eb937e77281963dba3794b Oct 02 11:34:48 crc kubenswrapper[4658]: I1002 11:34:48.348435 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerStarted","Data":"9189d902497667418e1e0de6261191f11ca4858e09eb937e77281963dba3794b"} Oct 02 11:34:48 crc kubenswrapper[4658]: I1002 11:34:48.951707 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:48 crc kubenswrapper[4658]: I1002 11:34:48.952985 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:34:48 crc kubenswrapper[4658]: I1002 11:34:48.957049 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2p9ff" Oct 02 11:34:48 crc kubenswrapper[4658]: I1002 11:34:48.970515 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:49 crc kubenswrapper[4658]: I1002 11:34:49.058520 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rq8s\" (UniqueName: \"kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s\") pod \"kube-state-metrics-0\" (UID: \"3d138ce0-7164-4e2f-9690-83719e55b301\") " pod="openstack/kube-state-metrics-0" Oct 02 11:34:49 crc kubenswrapper[4658]: I1002 11:34:49.160559 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rq8s\" (UniqueName: \"kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s\") pod \"kube-state-metrics-0\" (UID: \"3d138ce0-7164-4e2f-9690-83719e55b301\") " pod="openstack/kube-state-metrics-0" Oct 02 11:34:49 crc kubenswrapper[4658]: I1002 11:34:49.191420 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rq8s\" (UniqueName: \"kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s\") pod \"kube-state-metrics-0\" (UID: \"3d138ce0-7164-4e2f-9690-83719e55b301\") " pod="openstack/kube-state-metrics-0" Oct 02 11:34:49 crc kubenswrapper[4658]: I1002 11:34:49.274502 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.289307 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.294252 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.298215 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.300864 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.301066 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.301186 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.301348 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.301542 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5lk2c" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.307441 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.485767 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486122 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486167 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486198 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486272 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mbx\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486322 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486382 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.486428 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587657 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mbx\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587713 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587764 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587797 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587833 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587847 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587873 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.587894 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.590492 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.592067 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.593002 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.593268 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.593438 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.593555 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.596506 4658 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.596562 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b2727623f8bbe474018a880a77329ded2fae90762c86c59a9726b562d3cbf13f/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.616087 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mbx\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.636471 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:34:50 crc kubenswrapper[4658]: I1002 11:34:50.931056 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.152186 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-h2htr"]
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.154004 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.155806 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ck4mm"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.157118 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.157431 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.161303 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h2htr"]
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.184681 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tbnj8"]
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.187208 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.206877 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tbnj8"]
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.226834 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.226896 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-ovn-controller-tls-certs\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.226931 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed2f1df6-db7a-483e-a80d-298f12a389c8-scripts\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.226959 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxjs\" (UniqueName: \"kubernetes.io/projected/ff110d7e-a1dd-4a53-99c8-995af4a9d039-kube-api-access-thxjs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.226996 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvhc\" (UniqueName: \"kubernetes.io/projected/ed2f1df6-db7a-483e-a80d-298f12a389c8-kube-api-access-2gvhc\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227045 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-log-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227070 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227091 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-run\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227130 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-lib\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227174 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-etc-ovs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227214 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-combined-ca-bundle\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227247 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-log\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.227274 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff110d7e-a1dd-4a53-99c8-995af4a9d039-scripts\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328688 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-ovn-controller-tls-certs\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328757 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed2f1df6-db7a-483e-a80d-298f12a389c8-scripts\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328792 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thxjs\" (UniqueName: \"kubernetes.io/projected/ff110d7e-a1dd-4a53-99c8-995af4a9d039-kube-api-access-thxjs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328815 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvhc\" (UniqueName: \"kubernetes.io/projected/ed2f1df6-db7a-483e-a80d-298f12a389c8-kube-api-access-2gvhc\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328879 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-log-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr"
\"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328902 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328929 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-run\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.328970 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-lib\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.329013 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-etc-ovs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.329045 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-combined-ca-bundle\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.329077 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-log\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.329628 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff110d7e-a1dd-4a53-99c8-995af4a9d039-scripts\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.329695 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330057 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330058 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-run\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330269 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-run-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330397 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-lib\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330746 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2f1df6-db7a-483e-a80d-298f12a389c8-var-log-ovn\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.330921 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-etc-ovs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.332193 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ff110d7e-a1dd-4a53-99c8-995af4a9d039-var-log\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.333031 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed2f1df6-db7a-483e-a80d-298f12a389c8-scripts\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.334934 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff110d7e-a1dd-4a53-99c8-995af4a9d039-scripts\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.336377 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-combined-ca-bundle\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.348319 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxjs\" (UniqueName: \"kubernetes.io/projected/ff110d7e-a1dd-4a53-99c8-995af4a9d039-kube-api-access-thxjs\") pod \"ovn-controller-ovs-tbnj8\" (UID: \"ff110d7e-a1dd-4a53-99c8-995af4a9d039\") " pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.348861 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2f1df6-db7a-483e-a80d-298f12a389c8-ovn-controller-tls-certs\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.349413 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvhc\" (UniqueName: \"kubernetes.io/projected/ed2f1df6-db7a-483e-a80d-298f12a389c8-kube-api-access-2gvhc\") pod \"ovn-controller-h2htr\" (UID: \"ed2f1df6-db7a-483e-a80d-298f12a389c8\") " pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.478947 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h2htr" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.511776 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.599918 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.603063 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.605086 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.605407 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cjcx4" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.605575 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.605682 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.606681 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.608913 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.737888 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738048 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhh2\" (UniqueName: \"kubernetes.io/projected/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-kube-api-access-5hhh2\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738093 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:52 crc kubenswrapper[4658]: 
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738176 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738210 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738237 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-config\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.738273 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839260 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839324 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-config\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839361 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839398 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839485 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhh2\" (UniqueName: \"kubernetes.io/projected/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-kube-api-access-5hhh2\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839540 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839581 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.839987 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.840865 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-config\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.840869 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.841107 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.850611 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.854612 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.855012 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.860440 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhh2\" (UniqueName: \"kubernetes.io/projected/e17b8e1f-e0a9-4648-b16b-1f62fa63d507-kube-api-access-5hhh2\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.875503 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e17b8e1f-e0a9-4648-b16b-1f62fa63d507\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:52 crc kubenswrapper[4658]: I1002 11:34:52.921615 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.314925 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.316621 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.318413 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.318557 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-g7bsc"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.319898 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.320209 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.329961 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.502642 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.502707 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.502757 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.502881 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86gb\" (UniqueName: \"kubernetes.io/projected/44a349ce-b770-4e0a-bc23-afb9bdea6eba-kube-api-access-r86gb\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.502927 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-config\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.503203 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.503225 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.503266 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605369 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605410 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605443 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605483 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86gb\" (UniqueName: \"kubernetes.io/projected/44a349ce-b770-4e0a-bc23-afb9bdea6eba-kube-api-access-r86gb\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0"
Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605509 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605531 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605552 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605570 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.605947 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.606099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.606760 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.607471 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a349ce-b770-4e0a-bc23-afb9bdea6eba-config\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.614240 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.618715 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.620953 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44a349ce-b770-4e0a-bc23-afb9bdea6eba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.628404 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86gb\" (UniqueName: \"kubernetes.io/projected/44a349ce-b770-4e0a-bc23-afb9bdea6eba-kube-api-access-r86gb\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.633479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"44a349ce-b770-4e0a-bc23-afb9bdea6eba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:56 crc kubenswrapper[4658]: I1002 11:34:56.639536 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.170163 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.430150 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.430215 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.430259 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.430910 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:34:57 crc kubenswrapper[4658]: I1002 11:34:57.430967 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a" gracePeriod=600 Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.561240 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.561420 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq 
--interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfr5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fqwlp_openstack(c3256ab8-abfa-4828-9398-db8fea8e51d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.562826 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" podUID="c3256ab8-abfa-4828-9398-db8fea8e51d7" Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.564533 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.564677 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqgwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-d94kv_openstack(9ea97cd4-0286-4151-9815-15c2b5839d4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:34:57 crc kubenswrapper[4658]: E1002 11:34:57.565866 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" podUID="9ea97cd4-0286-4151-9815-15c2b5839d4e" Oct 02 11:34:58 crc kubenswrapper[4658]: I1002 11:34:58.431422 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a" exitCode=0 Oct 02 11:34:58 crc kubenswrapper[4658]: I1002 11:34:58.431500 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a"} Oct 02 11:34:58 crc kubenswrapper[4658]: I1002 11:34:58.431549 4658 scope.go:117] "RemoveContainer" containerID="2bbea38b7c4b625206d3cc6d00d2f3c0a2ccd06911eb1caf35974de1edfbf91d" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.071626 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.102022 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.151775 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl\") pod \"9ea97cd4-0286-4151-9815-15c2b5839d4e\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.151825 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc\") pod \"9ea97cd4-0286-4151-9815-15c2b5839d4e\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.151852 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config\") pod \"9ea97cd4-0286-4151-9815-15c2b5839d4e\" (UID: \"9ea97cd4-0286-4151-9815-15c2b5839d4e\") " Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.152514 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ea97cd4-0286-4151-9815-15c2b5839d4e" (UID: "9ea97cd4-0286-4151-9815-15c2b5839d4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.152527 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config" (OuterVolumeSpecName: "config") pod "9ea97cd4-0286-4151-9815-15c2b5839d4e" (UID: "9ea97cd4-0286-4151-9815-15c2b5839d4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.157389 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl" (OuterVolumeSpecName: "kube-api-access-rqgwl") pod "9ea97cd4-0286-4151-9815-15c2b5839d4e" (UID: "9ea97cd4-0286-4151-9815-15c2b5839d4e"). InnerVolumeSpecName "kube-api-access-rqgwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.253371 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfr5l\" (UniqueName: \"kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l\") pod \"c3256ab8-abfa-4828-9398-db8fea8e51d7\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.253464 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config\") pod \"c3256ab8-abfa-4828-9398-db8fea8e51d7\" (UID: \"c3256ab8-abfa-4828-9398-db8fea8e51d7\") " Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.253787 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqgwl\" (UniqueName: \"kubernetes.io/projected/9ea97cd4-0286-4151-9815-15c2b5839d4e-kube-api-access-rqgwl\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.253799 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.253808 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea97cd4-0286-4151-9815-15c2b5839d4e-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.254151 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config" (OuterVolumeSpecName: "config") pod "c3256ab8-abfa-4828-9398-db8fea8e51d7" (UID: "c3256ab8-abfa-4828-9398-db8fea8e51d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.262239 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l" (OuterVolumeSpecName: "kube-api-access-qfr5l") pod "c3256ab8-abfa-4828-9398-db8fea8e51d7" (UID: "c3256ab8-abfa-4828-9398-db8fea8e51d7"). InnerVolumeSpecName "kube-api-access-qfr5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.355570 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfr5l\" (UniqueName: \"kubernetes.io/projected/c3256ab8-abfa-4828-9398-db8fea8e51d7-kube-api-access-qfr5l\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.355856 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3256ab8-abfa-4828-9398-db8fea8e51d7-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.446149 4658 generic.go:334] "Generic (PLEG): container finished" podID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerID="949d27d0962578b6fb31447110c54c128d01ad7ac8d9ebcda6955dd068991e2e" exitCode=0 Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.446241 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" event={"ID":"a3713df1-dcc1-4baa-86dd-cd001a87df5e","Type":"ContainerDied","Data":"949d27d0962578b6fb31447110c54c128d01ad7ac8d9ebcda6955dd068991e2e"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.448135 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3f3cc404-a92f-4ef8-a799-83eb314e4382","Type":"ContainerStarted","Data":"7f271e12e50a0d469938b7bedf9c1e7baf703106aed29c258e4ec8a0f4bd785b"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.449905 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" event={"ID":"9ea97cd4-0286-4151-9815-15c2b5839d4e","Type":"ContainerDied","Data":"cf56b7d1a4553f1d78d7dd596b77c69aa6fbc369b4fcf5ed60d190f67f44a1b4"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.449973 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d94kv" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.452972 4658 generic.go:334] "Generic (PLEG): container finished" podID="37b043f6-c671-43a2-9062-d2969c0253a9" containerID="b00b7e97de47c65ec16441dcc377ff098c48bccf44e592c0aaad90cdd26ed224" exitCode=0 Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.453114 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" event={"ID":"37b043f6-c671-43a2-9062-d2969c0253a9","Type":"ContainerDied","Data":"b00b7e97de47c65ec16441dcc377ff098c48bccf44e592c0aaad90cdd26ed224"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.458547 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" event={"ID":"c3256ab8-abfa-4828-9398-db8fea8e51d7","Type":"ContainerDied","Data":"d49f69666778f0fe23acb8aa4bd10509468f02967e4b7c3d7c79ba3f70788978"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.458570 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fqwlp" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.475980 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952"} Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.577349 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.598952 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fqwlp"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.625417 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.625491 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.633719 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.638831 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d94kv"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.738908 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.761197 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h2htr"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.776187 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.798239 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.847887 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tbnj8"] Oct 02 11:34:59 crc kubenswrapper[4658]: W1002 11:34:59.851903 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecaec123_d0cf_493f_bee4_b32cd4f084bf.slice/crio-6e6151f4e0a2e8bb027ff9dcf7fc61c35cac50a67e3ca5603f789624ee23a03c WatchSource:0}: Error finding container 6e6151f4e0a2e8bb027ff9dcf7fc61c35cac50a67e3ca5603f789624ee23a03c: Status 404 returned error can't find the container with id 6e6151f4e0a2e8bb027ff9dcf7fc61c35cac50a67e3ca5603f789624ee23a03c Oct 02 11:34:59 crc kubenswrapper[4658]: W1002 11:34:59.853694 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d138ce0_7164_4e2f_9690_83719e55b301.slice/crio-ffbd2a528cc1d177d7f9a2f7321ba8730afd162b4bf0dea4dfde069809ba4491 WatchSource:0}: Error finding container ffbd2a528cc1d177d7f9a2f7321ba8730afd162b4bf0dea4dfde069809ba4491: Status 404 returned error can't find the container with id ffbd2a528cc1d177d7f9a2f7321ba8730afd162b4bf0dea4dfde069809ba4491 Oct 02 11:34:59 crc kubenswrapper[4658]: W1002 11:34:59.864794 4658 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff110d7e_a1dd_4a53_99c8_995af4a9d039.slice/crio-69b994ee77a4094178231c9c4742af3758580852ddc8dd75f29ae0be6e27f7e0 WatchSource:0}: Error finding container 69b994ee77a4094178231c9c4742af3758580852ddc8dd75f29ae0be6e27f7e0: Status 404 returned error can't find the container with id 69b994ee77a4094178231c9c4742af3758580852ddc8dd75f29ae0be6e27f7e0 Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.967364 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea97cd4-0286-4151-9815-15c2b5839d4e" path="/var/lib/kubelet/pods/9ea97cd4-0286-4151-9815-15c2b5839d4e/volumes" Oct 02 11:34:59 crc kubenswrapper[4658]: I1002 11:34:59.967819 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3256ab8-abfa-4828-9398-db8fea8e51d7" path="/var/lib/kubelet/pods/c3256ab8-abfa-4828-9398-db8fea8e51d7/volumes" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.432836 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rjq6k"] Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.434161 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.437883 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.455148 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rjq6k"] Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.499634 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d138ce0-7164-4e2f-9690-83719e55b301","Type":"ContainerStarted","Data":"ffbd2a528cc1d177d7f9a2f7321ba8730afd162b4bf0dea4dfde069809ba4491"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.501354 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"590179b8-356d-4392-bab5-037103481383","Type":"ContainerStarted","Data":"012ca3113e1e939a62afb4bb21738f353d39d9a3b0f3ba73dc1bd36384bb6521"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.503720 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerStarted","Data":"9c7072a35270fcb05ddffab44861d3786fd1c97783db9a19b3db1b7b220031e5"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.505818 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbnj8" event={"ID":"ff110d7e-a1dd-4a53-99c8-995af4a9d039","Type":"ContainerStarted","Data":"69b994ee77a4094178231c9c4742af3758580852ddc8dd75f29ae0be6e27f7e0"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.507374 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ecaec123-d0cf-493f-bee4-b32cd4f084bf","Type":"ContainerStarted","Data":"6e6151f4e0a2e8bb027ff9dcf7fc61c35cac50a67e3ca5603f789624ee23a03c"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.511781 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerStarted","Data":"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.523859 4658 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-h2htr" event={"ID":"ed2f1df6-db7a-483e-a80d-298f12a389c8","Type":"ContainerStarted","Data":"f2281e975de9100c4529966aabdf56ffa1f71e53304f5b39216fe6a5644caeb6"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.527711 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" event={"ID":"37b043f6-c671-43a2-9062-d2969c0253a9","Type":"ContainerStarted","Data":"608efdb34234573e4bb1f230ad3e2310cbd43b125bc7a055df498c4cd6d8fb4d"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.527916 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.530407 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerStarted","Data":"45ce70dfa87bf45be43bea17eccb78fc18bde1c7dd85e0383b8de78b862643e2"} Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.567017 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" podStartSLOduration=3.15011928 podStartE2EDuration="18.566992921s" podCreationTimestamp="2025-10-02 11:34:42 +0000 UTC" firstStartedPulling="2025-10-02 11:34:43.227221439 +0000 UTC m=+964.118375006" lastFinishedPulling="2025-10-02 11:34:58.64409508 +0000 UTC m=+979.535248647" observedRunningTime="2025-10-02 11:35:00.559966617 +0000 UTC m=+981.451120184" watchObservedRunningTime="2025-10-02 11:35:00.566992921 +0000 UTC m=+981.458146488" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.582937 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313d8a11-a864-4fe8-b083-cc3f713cd4f7-config\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.583026 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zz9b\" (UniqueName: \"kubernetes.io/projected/313d8a11-a864-4fe8-b083-cc3f713cd4f7-kube-api-access-6zz9b\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.583046 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-combined-ca-bundle\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.583067 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.583108 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovn-rundir\") pod 
\"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.583133 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovs-rundir\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.652619 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685336 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zz9b\" (UniqueName: \"kubernetes.io/projected/313d8a11-a864-4fe8-b083-cc3f713cd4f7-kube-api-access-6zz9b\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685379 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-combined-ca-bundle\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685411 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685448 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovn-rundir\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685497 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovs-rundir\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.685574 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313d8a11-a864-4fe8-b083-cc3f713cd4f7-config\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.686251 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovs-rundir\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.686633 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/313d8a11-a864-4fe8-b083-cc3f713cd4f7-ovn-rundir\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.687201 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313d8a11-a864-4fe8-b083-cc3f713cd4f7-config\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.691898 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.697689 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313d8a11-a864-4fe8-b083-cc3f713cd4f7-combined-ca-bundle\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.703022 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zz9b\" (UniqueName: \"kubernetes.io/projected/313d8a11-a864-4fe8-b083-cc3f713cd4f7-kube-api-access-6zz9b\") pod \"ovn-controller-metrics-rjq6k\" (UID: \"313d8a11-a864-4fe8-b083-cc3f713cd4f7\") " pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:00 crc kubenswrapper[4658]: I1002 11:35:00.762760 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rjq6k" Oct 02 11:35:01 crc kubenswrapper[4658]: W1002 11:35:01.330187 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17b8e1f_e0a9_4648_b16b_1f62fa63d507.slice/crio-42818e4c69c878740c390d8a948fe45436cb97bf93f6e1a5e1496c175f00eb16 WatchSource:0}: Error finding container 42818e4c69c878740c390d8a948fe45436cb97bf93f6e1a5e1496c175f00eb16: Status 404 returned error can't find the container with id 42818e4c69c878740c390d8a948fe45436cb97bf93f6e1a5e1496c175f00eb16 Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.400713 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:35:01 crc kubenswrapper[4658]: W1002 11:35:01.408406 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a349ce_b770_4e0a_bc23_afb9bdea6eba.slice/crio-e3e65f61cce7c620e526439f490086fe5442ed65b3525c7d66b30f1ca8628366 WatchSource:0}: Error finding container e3e65f61cce7c620e526439f490086fe5442ed65b3525c7d66b30f1ca8628366: Status 404 returned error can't find the container with id e3e65f61cce7c620e526439f490086fe5442ed65b3525c7d66b30f1ca8628366 Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.556167 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e17b8e1f-e0a9-4648-b16b-1f62fa63d507","Type":"ContainerStarted","Data":"42818e4c69c878740c390d8a948fe45436cb97bf93f6e1a5e1496c175f00eb16"} Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.561623 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" event={"ID":"a3713df1-dcc1-4baa-86dd-cd001a87df5e","Type":"ContainerStarted","Data":"66232855eac66fdbaede4df2af4d394b37633f33fe713cc846bdf47f3fa855a2"} Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.562117 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.572188 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"44a349ce-b770-4e0a-bc23-afb9bdea6eba","Type":"ContainerStarted","Data":"e3e65f61cce7c620e526439f490086fe5442ed65b3525c7d66b30f1ca8628366"} Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.835586 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" podStartSLOduration=5.250974858 podStartE2EDuration="20.8355634s" podCreationTimestamp="2025-10-02 11:34:41 +0000 UTC" firstStartedPulling="2025-10-02 11:34:43.148088753 +0000 UTC m=+964.039242320" lastFinishedPulling="2025-10-02 11:34:58.732677295 +0000 UTC m=+979.623830862" observedRunningTime="2025-10-02 11:35:01.585645365 +0000 UTC m=+982.476798932" watchObservedRunningTime="2025-10-02 11:35:01.8355634 +0000 UTC m=+982.726716967" Oct 02 11:35:01 crc kubenswrapper[4658]: I1002 11:35:01.841199 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rjq6k"] Oct 02 11:35:02 crc kubenswrapper[4658]: W1002 11:35:02.191489 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313d8a11_a864_4fe8_b083_cc3f713cd4f7.slice/crio-aa8f4974381427159080c20a1c6060f7de0fdb1d7bd172b51afa095c1e1b5ce6 WatchSource:0}: Error finding container 
aa8f4974381427159080c20a1c6060f7de0fdb1d7bd172b51afa095c1e1b5ce6: Status 404 returned error can't find the container with id aa8f4974381427159080c20a1c6060f7de0fdb1d7bd172b51afa095c1e1b5ce6 Oct 02 11:35:02 crc kubenswrapper[4658]: I1002 11:35:02.584676 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3f3cc404-a92f-4ef8-a799-83eb314e4382","Type":"ContainerStarted","Data":"505d5f5a62bff8cb5da9b2e2705828d2a9664b1a792b4a3e4f4f1cd82ea76c5a"} Oct 02 11:35:02 crc kubenswrapper[4658]: I1002 11:35:02.584810 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 11:35:02 crc kubenswrapper[4658]: I1002 11:35:02.587735 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rjq6k" event={"ID":"313d8a11-a864-4fe8-b083-cc3f713cd4f7","Type":"ContainerStarted","Data":"aa8f4974381427159080c20a1c6060f7de0fdb1d7bd172b51afa095c1e1b5ce6"} Oct 02 11:35:02 crc kubenswrapper[4658]: I1002 11:35:02.590912 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerStarted","Data":"962021eda53525352e51f7521305c62cf0b06e8762581492eb65c40a47f21d30"} Oct 02 11:35:02 crc kubenswrapper[4658]: I1002 11:35:02.606027 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.814460238 podStartE2EDuration="16.606005634s" podCreationTimestamp="2025-10-02 11:34:46 +0000 UTC" firstStartedPulling="2025-10-02 11:34:58.556091592 +0000 UTC m=+979.447245159" lastFinishedPulling="2025-10-02 11:35:01.347636988 +0000 UTC m=+982.238790555" observedRunningTime="2025-10-02 11:35:02.602725109 +0000 UTC m=+983.493878696" watchObservedRunningTime="2025-10-02 11:35:02.606005634 +0000 UTC m=+983.497159201" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.313657 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.316380 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="dnsmasq-dns" containerID="cri-o://66232855eac66fdbaede4df2af4d394b37633f33fe713cc846bdf47f3fa855a2" gracePeriod=10 Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.318401 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.340519 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.341874 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.343717 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.368217 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.379883 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.385752 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.386183 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.386427 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzpm\" (UniqueName: \"kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.487778 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.488102 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzpm\" (UniqueName: \"kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.488237 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.489380 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" 
Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.489112 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.489052 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.490346 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.517887 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxzpm\" (UniqueName: \"kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm\") pod \"dnsmasq-dns-7fd796d7df-f7zb8\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.638310 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.638548 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="dnsmasq-dns" containerID="cri-o://608efdb34234573e4bb1f230ad3e2310cbd43b125bc7a055df498c4cd6d8fb4d" gracePeriod=10 Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.640452 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.642819 4658 generic.go:334] "Generic (PLEG): container finished" podID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerID="66232855eac66fdbaede4df2af4d394b37633f33fe713cc846bdf47f3fa855a2" exitCode=0 Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.642851 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" event={"ID":"a3713df1-dcc1-4baa-86dd-cd001a87df5e","Type":"ContainerDied","Data":"66232855eac66fdbaede4df2af4d394b37633f33fe713cc846bdf47f3fa855a2"} Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.682183 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.692635 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.696627 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.709645 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"] Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.723988 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.793525 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.793663 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g762m\" (UniqueName: \"kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.793743 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.793880 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.793927 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.894924 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.894970 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.895037 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.895176 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g762m\" (UniqueName: \"kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.895202 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.895932 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.897145 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.897271 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.897648 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.909665 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 11:35:06 crc kubenswrapper[4658]: I1002 11:35:06.918000 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g762m\" (UniqueName: \"kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m\") pod \"dnsmasq-dns-86db49b7ff-96xvt\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:07 crc kubenswrapper[4658]: I1002 11:35:07.070621 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:07 crc kubenswrapper[4658]: I1002 11:35:07.276327 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: connect: connection refused" Oct 02 11:35:07 crc kubenswrapper[4658]: I1002 11:35:07.651705 4658 generic.go:334] "Generic (PLEG): container finished" podID="37b043f6-c671-43a2-9062-d2969c0253a9" containerID="608efdb34234573e4bb1f230ad3e2310cbd43b125bc7a055df498c4cd6d8fb4d" exitCode=0 Oct 02 11:35:07 crc kubenswrapper[4658]: I1002 11:35:07.651750 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" event={"ID":"37b043f6-c671-43a2-9062-d2969c0253a9","Type":"ContainerDied","Data":"608efdb34234573e4bb1f230ad3e2310cbd43b125bc7a055df498c4cd6d8fb4d"} Oct 02 11:35:07 crc kubenswrapper[4658]: I1002 11:35:07.671987 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: connect: connection refused" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.662357 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" event={"ID":"37b043f6-c671-43a2-9062-d2969c0253a9","Type":"ContainerDied","Data":"fd8eeeb2d062a059544dafe45aa1aa0615477b29bc0b45f758074eceab48f38b"} Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.662887 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd8eeeb2d062a059544dafe45aa1aa0615477b29bc0b45f758074eceab48f38b" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.683689 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.727983 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc\") pod \"37b043f6-c671-43a2-9062-d2969c0253a9\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.728307 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4jdj\" (UniqueName: \"kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj\") pod \"37b043f6-c671-43a2-9062-d2969c0253a9\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.729528 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config\") pod \"37b043f6-c671-43a2-9062-d2969c0253a9\" (UID: \"37b043f6-c671-43a2-9062-d2969c0253a9\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.732276 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj" (OuterVolumeSpecName: "kube-api-access-w4jdj") pod "37b043f6-c671-43a2-9062-d2969c0253a9" (UID: "37b043f6-c671-43a2-9062-d2969c0253a9"). InnerVolumeSpecName "kube-api-access-w4jdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.771761 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37b043f6-c671-43a2-9062-d2969c0253a9" (UID: "37b043f6-c671-43a2-9062-d2969c0253a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.776419 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config" (OuterVolumeSpecName: "config") pod "37b043f6-c671-43a2-9062-d2969c0253a9" (UID: "37b043f6-c671-43a2-9062-d2969c0253a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.832423 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.832683 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b043f6-c671-43a2-9062-d2969c0253a9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.832697 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4jdj\" (UniqueName: \"kubernetes.io/projected/37b043f6-c671-43a2-9062-d2969c0253a9-kube-api-access-w4jdj\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.901061 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.937883 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc\") pod \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.937960 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config\") pod \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.938000 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7v8l\" (UniqueName: \"kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l\") pod \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\" (UID: \"a3713df1-dcc1-4baa-86dd-cd001a87df5e\") " Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.944942 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l" (OuterVolumeSpecName: "kube-api-access-r7v8l") pod "a3713df1-dcc1-4baa-86dd-cd001a87df5e" (UID: "a3713df1-dcc1-4baa-86dd-cd001a87df5e"). InnerVolumeSpecName "kube-api-access-r7v8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.982360 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config" (OuterVolumeSpecName: "config") pod "a3713df1-dcc1-4baa-86dd-cd001a87df5e" (UID: "a3713df1-dcc1-4baa-86dd-cd001a87df5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:08 crc kubenswrapper[4658]: I1002 11:35:08.987196 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3713df1-dcc1-4baa-86dd-cd001a87df5e" (UID: "a3713df1-dcc1-4baa-86dd-cd001a87df5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.040179 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.040220 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3713df1-dcc1-4baa-86dd-cd001a87df5e-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.040233 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7v8l\" (UniqueName: \"kubernetes.io/projected/a3713df1-dcc1-4baa-86dd-cd001a87df5e-kube-api-access-r7v8l\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.370134 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.413961 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-kf47n"] Oct 02 11:35:09 crc kubenswrapper[4658]: E1002 11:35:09.414418 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="dnsmasq-dns" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.414436 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="dnsmasq-dns" Oct 02 11:35:09 crc kubenswrapper[4658]: E1002 11:35:09.414444 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="init" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.414452 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="init" Oct 02 11:35:09 crc kubenswrapper[4658]: E1002 11:35:09.414467 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="dnsmasq-dns" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.414475 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" containerName="dnsmasq-dns" Oct 02 11:35:09 crc kubenswrapper[4658]: E1002 11:35:09.414509 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="init" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.414517 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="init" Oct 02 11:35:09 
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.414753 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" containerName="dnsmasq-dns"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.415785 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.435108 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kf47n"]
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.449211 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.449488 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.449519 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4g7c\" (UniqueName: \"kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.449998 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.450035 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.517454 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"]
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.551878 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.551939 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n"
\"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.552050 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.552078 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.552102 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4g7c\" (UniqueName: \"kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.553585 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.553588 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.553655 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.554393 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.571472 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4g7c\" (UniqueName: \"kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c\") pod \"dnsmasq-dns-698758b865-kf47n\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.641047 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.806721 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hg2rp" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.810399 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.810395 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-l8bhf" event={"ID":"a3713df1-dcc1-4baa-86dd-cd001a87df5e","Type":"ContainerDied","Data":"d93b2d99b74ada1948ad12252c933b6ff19d5850c1c2327e8a9360dd0d063f94"} Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.810853 4658 scope.go:117] "RemoveContainer" containerID="66232855eac66fdbaede4df2af4d394b37633f33fe713cc846bdf47f3fa855a2" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.840461 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.848344 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.858432 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-l8bhf"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.863615 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.868996 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hg2rp"] Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.959492 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b043f6-c671-43a2-9062-d2969c0253a9" path="/var/lib/kubelet/pods/37b043f6-c671-43a2-9062-d2969c0253a9/volumes" Oct 02 11:35:09 crc kubenswrapper[4658]: I1002 11:35:09.960069 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3713df1-dcc1-4baa-86dd-cd001a87df5e" path="/var/lib/kubelet/pods/a3713df1-dcc1-4baa-86dd-cd001a87df5e/volumes" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.293847 4658 scope.go:117] "RemoveContainer" containerID="949d27d0962578b6fb31447110c54c128d01ad7ac8d9ebcda6955dd068991e2e" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.488238 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.503542 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.506074 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rxlkg" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.506460 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.507193 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.507807 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.508138 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.614733 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.616437 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.616521 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-cache\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.616615 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-lock\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.616652 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7klst\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-kube-api-access-7klst\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720021 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720090 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720235 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-cache\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720475 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-lock\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720505 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7klst\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-kube-api-access-7klst\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.720636 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.721201 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-cache\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: E1002 11:35:10.721334 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:35:10 crc kubenswrapper[4658]: E1002 11:35:10.721351 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:35:10 crc kubenswrapper[4658]: E1002 11:35:10.721402 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:11.221384291 +0000 UTC m=+992.112537858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.721409 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d0e9bcc-e466-4017-92b9-d12e55fc7953-lock\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.743789 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7klst\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-kube-api-access-7klst\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.860123 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" event={"ID":"dadb9a0f-9e17-4171-af8c-c47fae2db1a1","Type":"ContainerStarted","Data":"597ae9a0cb75c2adb666fc840f5bc4d708b175d8898cf43da2a2c1bde18c6e65"} Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.863620 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"590179b8-356d-4392-bab5-037103481383","Type":"ContainerStarted","Data":"bb608a940d1d9a1235c40ea8fe1309cbb2481466c7de1dd6d29d26c5eedcac1d"} Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.865223 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerStarted","Data":"0bd1eafed9337e6a4ecff8f1cb3280359be115aed0dfc4188584319b97169b0b"} Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.869571 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e17b8e1f-e0a9-4648-b16b-1f62fa63d507","Type":"ContainerStarted","Data":"ef9b9eff382a7f68f31fa06972cd8c05e50eab0b663d5b2e4f191465bf33fafb"} Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.872054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ecaec123-d0cf-493f-bee4-b32cd4f084bf","Type":"ContainerStarted","Data":"e720ad13b45f3485d88f673a774cff628dd7c2f6d490976ec4eaed02992410e7"} Oct 02 11:35:10 crc kubenswrapper[4658]: I1002 11:35:10.992247 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fkzqr"] Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.007871 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.013377 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.014054 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.016288 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fkzqr"] Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.029597 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.032684 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.033759 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.033844 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.033975 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.034894 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ddw\" (UniqueName: \"kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.035016 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.035083 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.060674 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136405 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ddw\" (UniqueName: \"kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136478 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136557 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136602 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136638 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.136677 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.138386 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.138387 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.138785 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.145236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.147945 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.148985 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.173503 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ddw\" (UniqueName: \"kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw\") pod \"swift-ring-rebalance-fkzqr\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " pod="openstack/swift-ring-rebalance-fkzqr"
Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.238435 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0"
Oct 02 11:35:11 crc kubenswrapper[4658]: E1002 11:35:11.238612 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 11:35:11 crc kubenswrapper[4658]: E1002 11:35:11.238629 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 11:35:11 crc kubenswrapper[4658]: E1002 11:35:11.238672 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:12.238658636 +0000 UTC m=+993.129812203 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.340158 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.885530 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h2htr" event={"ID":"ed2f1df6-db7a-483e-a80d-298f12a389c8","Type":"ContainerStarted","Data":"8f485d628ae99a196e268ee44b2dc17a42c6adf6d99f77256f9fb3f06dfafdbf"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.888487 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d138ce0-7164-4e2f-9690-83719e55b301","Type":"ContainerStarted","Data":"2a485927051fda1e74c45cc634305f6bea335369dbb3237494fd65af75528a2b"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.893280 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" event={"ID":"dadb9a0f-9e17-4171-af8c-c47fae2db1a1","Type":"ContainerStarted","Data":"097f4fcd7bae26358cdb2e40bd8aa5c46f6189dbc3d55e1b479fb93f10842cce"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.895427 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerStarted","Data":"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.905987 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e17b8e1f-e0a9-4648-b16b-1f62fa63d507","Type":"ContainerStarted","Data":"544305502b91751c0f6f3b2f5b486a855f69a90bbf295efbbe653afd2633482f"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.910349 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbnj8" event={"ID":"ff110d7e-a1dd-4a53-99c8-995af4a9d039","Type":"ContainerStarted","Data":"65a01f255098ded16db1d27fda46d7c36725bced85bbe3359d1d184d275bad8e"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.918580 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"44a349ce-b770-4e0a-bc23-afb9bdea6eba","Type":"ContainerStarted","Data":"fa7b712950f46266635b29ef7d41297b4ddeb2de29ebfe83b2d507ec7faeff60"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.920053 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rjq6k" event={"ID":"313d8a11-a864-4fe8-b083-cc3f713cd4f7","Type":"ContainerStarted","Data":"da0fd1c449c74aecf754230ef9aae8e3c15dd401a704d92c2caac8a9296fb4a4"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.926422 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kf47n" event={"ID":"abdb68cf-d07d-4e71-9489-236c44f58641","Type":"ContainerStarted","Data":"ab4586289fb9b0ece49b57be10f63f4276688d10a12a58375cdc6baf24eba721"} Oct 02 11:35:11 crc kubenswrapper[4658]: I1002 11:35:11.996592 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fkzqr"] Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.265566 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:12 crc kubenswrapper[4658]: E1002 11:35:12.265786 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:35:12 crc kubenswrapper[4658]: E1002 11:35:12.266009 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:35:12 crc kubenswrapper[4658]: E1002 11:35:12.266120 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:14.266099259 +0000 UTC m=+995.157252826 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.936113 4658 generic.go:334] "Generic (PLEG): container finished" podID="ff110d7e-a1dd-4a53-99c8-995af4a9d039" containerID="65a01f255098ded16db1d27fda46d7c36725bced85bbe3359d1d184d275bad8e" exitCode=0 Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.936391 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbnj8" event={"ID":"ff110d7e-a1dd-4a53-99c8-995af4a9d039","Type":"ContainerDied","Data":"65a01f255098ded16db1d27fda46d7c36725bced85bbe3359d1d184d275bad8e"} Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.939169 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"44a349ce-b770-4e0a-bc23-afb9bdea6eba","Type":"ContainerStarted","Data":"d0f56fe56b76ca331b3a1be4840302bea1b7ec13a1594e12bce0e92fbd57dfe9"} Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.942804 4658 generic.go:334] "Generic (PLEG): container finished" podID="dadb9a0f-9e17-4171-af8c-c47fae2db1a1" containerID="097f4fcd7bae26358cdb2e40bd8aa5c46f6189dbc3d55e1b479fb93f10842cce" exitCode=0 Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.942870 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" event={"ID":"dadb9a0f-9e17-4171-af8c-c47fae2db1a1","Type":"ContainerDied","Data":"097f4fcd7bae26358cdb2e40bd8aa5c46f6189dbc3d55e1b479fb93f10842cce"} Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.948783 4658 generic.go:334] "Generic (PLEG): container finished" podID="abdb68cf-d07d-4e71-9489-236c44f58641" containerID="7ca5fb5c5ef318f07b1c5526af2b9ad68ed85d4da8d94a8f89735b2cb09bc1d5" exitCode=0 Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.948866 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kf47n" event={"ID":"abdb68cf-d07d-4e71-9489-236c44f58641","Type":"ContainerDied","Data":"7ca5fb5c5ef318f07b1c5526af2b9ad68ed85d4da8d94a8f89735b2cb09bc1d5"} Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.951430 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fkzqr" event={"ID":"0909c66f-f3c6-440c-add2-8784d1c209c7","Type":"ContainerStarted","Data":"e4ef8b2a7f43f48d4d3b2f784abff760ac167efa8ce65d08d9df06fc5fa62d6e"} Oct 02 11:35:12 crc 
Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.953850 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerDied","Data":"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2"}
Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.966228 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerStarted","Data":"b1679924fa14ef08b2595b7568d88e7f15b09384631dbf3c1591288012ee5b6d"}
Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.966280 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-h2htr"
Oct 02 11:35:12 crc kubenswrapper[4658]: I1002 11:35:12.966322 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.065487 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.868409839 podStartE2EDuration="18.065469853s" podCreationTimestamp="2025-10-02 11:34:55 +0000 UTC" firstStartedPulling="2025-10-02 11:35:01.417192379 +0000 UTC m=+982.308345946" lastFinishedPulling="2025-10-02 11:35:08.614252393 +0000 UTC m=+989.505405960" observedRunningTime="2025-10-02 11:35:13.054951608 +0000 UTC m=+993.946105175" watchObservedRunningTime="2025-10-02 11:35:13.065469853 +0000 UTC m=+993.956623420"
Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.092625 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-h2htr" podStartSLOduration=12.8939659 podStartE2EDuration="21.092603115s" podCreationTimestamp="2025-10-02 11:34:52 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.877049146 +0000 UTC m=+980.768202713" lastFinishedPulling="2025-10-02 11:35:08.075686361 +0000 UTC m=+988.966839928" observedRunningTime="2025-10-02 11:35:13.085638103 +0000 UTC m=+993.976791680" watchObservedRunningTime="2025-10-02 11:35:13.092603115 +0000 UTC m=+993.983756682"
Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.111352 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rjq6k" podStartSLOduration=5.853716263 podStartE2EDuration="13.111333701s" podCreationTimestamp="2025-10-02 11:35:00 +0000 UTC" firstStartedPulling="2025-10-02 11:35:02.194413419 +0000 UTC m=+983.085566986" lastFinishedPulling="2025-10-02 11:35:09.452030857 +0000 UTC m=+990.343184424" observedRunningTime="2025-10-02 11:35:13.107837229 +0000 UTC m=+993.998990816" watchObservedRunningTime="2025-10-02 11:35:13.111333701 +0000 UTC m=+994.002487268"
Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.189324 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.909384622 podStartE2EDuration="22.189284369s" podCreationTimestamp="2025-10-02 11:34:51 +0000 UTC" firstStartedPulling="2025-10-02 11:35:01.334066217 +0000 UTC m=+982.225219784" lastFinishedPulling="2025-10-02 11:35:08.613965974 +0000 UTC m=+989.505119531" observedRunningTime="2025-10-02 11:35:13.18649787 +0000 UTC m=+994.077651437" watchObservedRunningTime="2025-10-02 11:35:13.189284369 +0000 UTC m=+994.080437936"
UTC m=+994.077651437" watchObservedRunningTime="2025-10-02 11:35:13.189284369 +0000 UTC m=+994.080437936" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.215657 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.584540151 podStartE2EDuration="25.215635887s" podCreationTimestamp="2025-10-02 11:34:48 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.856010798 +0000 UTC m=+980.747164365" lastFinishedPulling="2025-10-02 11:35:10.487106534 +0000 UTC m=+991.378260101" observedRunningTime="2025-10-02 11:35:13.209180411 +0000 UTC m=+994.100333978" watchObservedRunningTime="2025-10-02 11:35:13.215635887 +0000 UTC m=+994.106789454" Oct 02 11:35:13 crc kubenswrapper[4658]: E1002 11:35:13.309703 4658 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 02 11:35:13 crc kubenswrapper[4658]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b5acc72c-89e2-455e-ada7-aa71d35b1c20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 11:35:13 crc kubenswrapper[4658]: > podSandboxID="0bd1eafed9337e6a4ecff8f1cb3280359be115aed0dfc4188584319b97169b0b" Oct 02 11:35:13 crc kubenswrapper[4658]: E1002 11:35:13.310109 4658 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 02 11:35:13 crc kubenswrapper[4658]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g762m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-96xvt_openstack(b5acc72c-89e2-455e-ada7-aa71d35b1c20): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b5acc72c-89e2-455e-ada7-aa71d35b1c20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 11:35:13 crc kubenswrapper[4658]: > logger="UnhandledError" Oct 02 11:35:13 crc kubenswrapper[4658]: E1002 11:35:13.311612 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b5acc72c-89e2-455e-ada7-aa71d35b1c20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.481887 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:13 crc kubenswrapper[4658]: E1002 11:35:13.574426 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5acc72c_89e2_455e_ada7_aa71d35b1c20.slice/crio-662a8ac322459bbfea056d21d7b0c10390a430deca7a27c78aa466d3c7f461c0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5acc72c_89e2_455e_ada7_aa71d35b1c20.slice/crio-conmon-662a8ac322459bbfea056d21d7b0c10390a430deca7a27c78aa466d3c7f461c0.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.610725 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb\") pod \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.613636 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config\") pod \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.613715 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc\") pod \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.613787 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxzpm\" (UniqueName: \"kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm\") pod \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\" (UID: \"dadb9a0f-9e17-4171-af8c-c47fae2db1a1\") " Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.618423 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm" (OuterVolumeSpecName: "kube-api-access-cxzpm") pod "dadb9a0f-9e17-4171-af8c-c47fae2db1a1" (UID: "dadb9a0f-9e17-4171-af8c-c47fae2db1a1"). InnerVolumeSpecName "kube-api-access-cxzpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.639333 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dadb9a0f-9e17-4171-af8c-c47fae2db1a1" (UID: "dadb9a0f-9e17-4171-af8c-c47fae2db1a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.641052 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config" (OuterVolumeSpecName: "config") pod "dadb9a0f-9e17-4171-af8c-c47fae2db1a1" (UID: "dadb9a0f-9e17-4171-af8c-c47fae2db1a1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.642106 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dadb9a0f-9e17-4171-af8c-c47fae2db1a1" (UID: "dadb9a0f-9e17-4171-af8c-c47fae2db1a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.716282 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.716337 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxzpm\" (UniqueName: \"kubernetes.io/projected/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-kube-api-access-cxzpm\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.716356 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.716367 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadb9a0f-9e17-4171-af8c-c47fae2db1a1-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.922822 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.976733 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.981677 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbnj8" event={"ID":"ff110d7e-a1dd-4a53-99c8-995af4a9d039","Type":"ContainerStarted","Data":"81a902ba3c6b4eab5aef69a7ecfb6c206ba68cd4a722af04a1dfac89f0c9a4f7"} Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.981727 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbnj8" event={"ID":"ff110d7e-a1dd-4a53-99c8-995af4a9d039","Type":"ContainerStarted","Data":"85e7d361efa78ed37ae90de5d47b736bbce5c5d9d671c0c0ba73e8dcc8da6e00"} Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.981853 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.985402 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" event={"ID":"dadb9a0f-9e17-4171-af8c-c47fae2db1a1","Type":"ContainerDied","Data":"597ae9a0cb75c2adb666fc840f5bc4d708b175d8898cf43da2a2c1bde18c6e65"} Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.985409 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f7zb8" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.985452 4658 scope.go:117] "RemoveContainer" containerID="097f4fcd7bae26358cdb2e40bd8aa5c46f6189dbc3d55e1b479fb93f10842cce" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.990086 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kf47n" event={"ID":"abdb68cf-d07d-4e71-9489-236c44f58641","Type":"ContainerStarted","Data":"643f7791d269fb64976908c814c130765ee1b587b4f3901db7bc1580d007f634"} Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.990791 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 11:35:13 crc kubenswrapper[4658]: I1002 11:35:13.990819 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.062593 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.069183 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f7zb8"] Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.074927 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tbnj8" podStartSLOduration=13.887865877 podStartE2EDuration="22.074909164s" podCreationTimestamp="2025-10-02 11:34:52 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.877525041 +0000 UTC m=+980.768678608" lastFinishedPulling="2025-10-02 11:35:08.064568328 +0000 UTC m=+988.955721895" observedRunningTime="2025-10-02 11:35:14.071264228 +0000 UTC m=+994.962417815" watchObservedRunningTime="2025-10-02 11:35:14.074909164 +0000 UTC m=+994.966062731" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.326130 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:14 crc kubenswrapper[4658]: E1002 11:35:14.326358 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:35:14 crc kubenswrapper[4658]: E1002 11:35:14.326386 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:35:14 crc kubenswrapper[4658]: E1002 11:35:14.326447 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:18.32642981 +0000 UTC m=+999.217583377 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.640155 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.682593 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.702506 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-kf47n" podStartSLOduration=5.702485735 podStartE2EDuration="5.702485735s" podCreationTimestamp="2025-10-02 11:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:14.090051776 +0000 UTC m=+994.981205343" watchObservedRunningTime="2025-10-02 11:35:14.702485735 +0000 UTC m=+995.593639302" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.999806 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:35:14 crc kubenswrapper[4658]: I1002 11:35:14.999850 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 11:35:15 crc kubenswrapper[4658]: I1002 11:35:15.960847 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadb9a0f-9e17-4171-af8c-c47fae2db1a1" path="/var/lib/kubelet/pods/dadb9a0f-9e17-4171-af8c-c47fae2db1a1/volumes" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.010325 4658 generic.go:334] "Generic (PLEG): container finished" podID="590179b8-356d-4392-bab5-037103481383" containerID="bb608a940d1d9a1235c40ea8fe1309cbb2481466c7de1dd6d29d26c5eedcac1d" exitCode=0 Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.010542 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"590179b8-356d-4392-bab5-037103481383","Type":"ContainerDied","Data":"bb608a940d1d9a1235c40ea8fe1309cbb2481466c7de1dd6d29d26c5eedcac1d"} Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.058703 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.058970 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.319761 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:35:16 crc kubenswrapper[4658]: E1002 11:35:16.320497 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadb9a0f-9e17-4171-af8c-c47fae2db1a1" containerName="init" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.320514 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadb9a0f-9e17-4171-af8c-c47fae2db1a1" containerName="init" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.320712 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadb9a0f-9e17-4171-af8c-c47fae2db1a1" containerName="init" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.321725 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.325040 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8hrw7" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.327781 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.328128 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.328322 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.348727 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463049 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-scripts\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463146 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-config\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463258 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463339 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8spkp\" (UniqueName: \"kubernetes.io/projected/57cd238e-33f1-4536-bcf1-1ca7e57a141a-kube-api-access-8spkp\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463401 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463504 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.463581 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: 
I1002 11:35:16.564712 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-scripts\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564757 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-config\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564820 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564855 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8spkp\" (UniqueName: \"kubernetes.io/projected/57cd238e-33f1-4536-bcf1-1ca7e57a141a-kube-api-access-8spkp\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564892 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564933 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.564954 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.565572 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.565976 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-scripts\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.566746 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cd238e-33f1-4536-bcf1-1ca7e57a141a-config\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.571421 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.571532 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.572088 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd238e-33f1-4536-bcf1-1ca7e57a141a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.586178 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8spkp\" (UniqueName: \"kubernetes.io/projected/57cd238e-33f1-4536-bcf1-1ca7e57a141a-kube-api-access-8spkp\") pod \"ovn-northd-0\" (UID: \"57cd238e-33f1-4536-bcf1-1ca7e57a141a\") " pod="openstack/ovn-northd-0" Oct 02 11:35:16 crc kubenswrapper[4658]: I1002 11:35:16.640707 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:35:18 crc kubenswrapper[4658]: I1002 11:35:18.396861 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:18 crc kubenswrapper[4658]: E1002 11:35:18.397059 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:35:18 crc kubenswrapper[4658]: E1002 11:35:18.397287 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:35:18 crc kubenswrapper[4658]: E1002 11:35:18.397400 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:26.39735993 +0000 UTC m=+1007.288513507 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.037921 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4544e55-087c-4095-be50-820df44e0a48" containerID="b1679924fa14ef08b2595b7568d88e7f15b09384631dbf3c1591288012ee5b6d" exitCode=0 Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.038050 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerDied","Data":"b1679924fa14ef08b2595b7568d88e7f15b09384631dbf3c1591288012ee5b6d"} Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.041330 4658 generic.go:334] "Generic (PLEG): container finished" podID="ecaec123-d0cf-493f-bee4-b32cd4f084bf" containerID="e720ad13b45f3485d88f673a774cff628dd7c2f6d490976ec4eaed02992410e7" exitCode=0 Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.041361 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ecaec123-d0cf-493f-bee4-b32cd4f084bf","Type":"ContainerDied","Data":"e720ad13b45f3485d88f673a774cff628dd7c2f6d490976ec4eaed02992410e7"} Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.281375 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.352478 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:35:19 crc kubenswrapper[4658]: W1002 11:35:19.357831 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57cd238e_33f1_4536_bcf1_1ca7e57a141a.slice/crio-fe432428b41dbd3b27af2b0f3d521b54cb3ee466ccaf1ed3b97bbd41217e5473 WatchSource:0}: Error finding container fe432428b41dbd3b27af2b0f3d521b54cb3ee466ccaf1ed3b97bbd41217e5473: Status 404 returned error can't find the container with id fe432428b41dbd3b27af2b0f3d521b54cb3ee466ccaf1ed3b97bbd41217e5473 Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.643571 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:35:19 crc kubenswrapper[4658]: I1002 11:35:19.708898 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"] Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.060984 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"590179b8-356d-4392-bab5-037103481383","Type":"ContainerStarted","Data":"a0f3fde928b7493809e71fee6962068b34f615316c48640e585e33a97ce2a954"} Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.062610 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fkzqr" event={"ID":"0909c66f-f3c6-440c-add2-8784d1c209c7","Type":"ContainerStarted","Data":"e9a75a1fd8e620c094592bc740130f6c28cad485ffadecdb9cd715a43d1c635e"} Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.064164 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerStarted","Data":"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7"} Oct 02 11:35:20 crc 
kubenswrapper[4658]: I1002 11:35:20.064238 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.064234 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="dnsmasq-dns" containerID="cri-o://e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7" gracePeriod=10 Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.068359 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ecaec123-d0cf-493f-bee4-b32cd4f084bf","Type":"ContainerStarted","Data":"41717e7baf75626d98e987890f0aacb596aa86839d5667354885cb31659009de"} Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.070921 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57cd238e-33f1-4536-bcf1-1ca7e57a141a","Type":"ContainerStarted","Data":"fe432428b41dbd3b27af2b0f3d521b54cb3ee466ccaf1ed3b97bbd41217e5473"} Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.084827 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.257689381 podStartE2EDuration="35.084810496s" podCreationTimestamp="2025-10-02 11:34:45 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.789496853 +0000 UTC m=+980.680650420" lastFinishedPulling="2025-10-02 11:35:08.616617968 +0000 UTC m=+989.507771535" observedRunningTime="2025-10-02 11:35:20.079972752 +0000 UTC m=+1000.971126319" watchObservedRunningTime="2025-10-02 11:35:20.084810496 +0000 UTC m=+1000.975964063" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.108211 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.780560653 podStartE2EDuration="35.108193059s" podCreationTimestamp="2025-10-02 11:34:45 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.856683859 +0000 UTC m=+980.747837426" lastFinishedPulling="2025-10-02 11:35:08.184316265 +0000 UTC m=+989.075469832" observedRunningTime="2025-10-02 11:35:20.105635418 +0000 UTC m=+1000.996789005" watchObservedRunningTime="2025-10-02 11:35:20.108193059 +0000 UTC m=+1000.999346626" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.127009 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" podStartSLOduration=14.126981286 podStartE2EDuration="14.126981286s" podCreationTimestamp="2025-10-02 11:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:20.124822888 +0000 UTC m=+1001.015976495" watchObservedRunningTime="2025-10-02 11:35:20.126981286 +0000 UTC m=+1001.018134853" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.166368 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-fkzqr" podStartSLOduration=3.135067004 podStartE2EDuration="10.166336058s" podCreationTimestamp="2025-10-02 11:35:10 +0000 UTC" firstStartedPulling="2025-10-02 11:35:11.990778546 +0000 UTC m=+992.881932113" lastFinishedPulling="2025-10-02 11:35:19.0220476 +0000 UTC m=+999.913201167" observedRunningTime="2025-10-02 11:35:20.151472205 +0000 UTC m=+1001.042625772" watchObservedRunningTime="2025-10-02 11:35:20.166336058 +0000 UTC 
m=+1001.057489625" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.792414 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.974808 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc\") pod \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.975364 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config\") pod \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.975445 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb\") pod \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.975511 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb\") pod \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.975620 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g762m\" (UniqueName: \"kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m\") pod \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\" (UID: \"b5acc72c-89e2-455e-ada7-aa71d35b1c20\") " Oct 02 11:35:20 crc kubenswrapper[4658]: I1002 11:35:20.984687 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m" (OuterVolumeSpecName: "kube-api-access-g762m") pod "b5acc72c-89e2-455e-ada7-aa71d35b1c20" (UID: "b5acc72c-89e2-455e-ada7-aa71d35b1c20"). InnerVolumeSpecName "kube-api-access-g762m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.029106 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config" (OuterVolumeSpecName: "config") pod "b5acc72c-89e2-455e-ada7-aa71d35b1c20" (UID: "b5acc72c-89e2-455e-ada7-aa71d35b1c20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.029961 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5acc72c-89e2-455e-ada7-aa71d35b1c20" (UID: "b5acc72c-89e2-455e-ada7-aa71d35b1c20"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.034358 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5acc72c-89e2-455e-ada7-aa71d35b1c20" (UID: "b5acc72c-89e2-455e-ada7-aa71d35b1c20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.037283 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5acc72c-89e2-455e-ada7-aa71d35b1c20" (UID: "b5acc72c-89e2-455e-ada7-aa71d35b1c20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.078238 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g762m\" (UniqueName: \"kubernetes.io/projected/b5acc72c-89e2-455e-ada7-aa71d35b1c20-kube-api-access-g762m\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.078281 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.078310 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.078324 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.078335 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5acc72c-89e2-455e-ada7-aa71d35b1c20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.101126 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57cd238e-33f1-4536-bcf1-1ca7e57a141a","Type":"ContainerStarted","Data":"35499a3e542e80599ab47c6f72c531a2c1f3788bc7ce30e03b47c62e617ac7ca"} Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.103068 4658 generic.go:334] "Generic (PLEG): container finished" podID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerID="e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7" exitCode=0 Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.105075 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.107428 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerDied","Data":"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7"} Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.107829 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-96xvt" event={"ID":"b5acc72c-89e2-455e-ada7-aa71d35b1c20","Type":"ContainerDied","Data":"0bd1eafed9337e6a4ecff8f1cb3280359be115aed0dfc4188584319b97169b0b"} Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.107859 4658 scope.go:117] "RemoveContainer" containerID="e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.125790 4658 scope.go:117] "RemoveContainer" containerID="8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.150360 4658 scope.go:117] "RemoveContainer" containerID="e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7" Oct 02 11:35:21 crc kubenswrapper[4658]: E1002 11:35:21.150833 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7\": container with ID starting with e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7 not found: ID does not exist" containerID="e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.150879 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7"} err="failed to get container status \"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7\": rpc error: code = NotFound desc = could not find container \"e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7\": container with ID starting with e0537a7f3e608c24a2670a89e07123c712fa56d1198c3943d357c7ef5be916d7 not found: ID does not exist" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.150911 4658 scope.go:117] "RemoveContainer" containerID="8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2" Oct 02 11:35:21 crc kubenswrapper[4658]: E1002 11:35:21.151339 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2\": container with ID starting with 8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2 not found: ID does not exist" containerID="8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2" Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.151398 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2"} err="failed to get container status \"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2\": rpc error: code = NotFound desc = could not find container \"8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2\": container with ID starting with 8935a85b315926605217f2bc72e32c4af73ca24b46228c972ce8e1734b32a3f2 not found: ID does not exist" Oct 02 11:35:21 crc 
kubenswrapper[4658]: I1002 11:35:21.199464 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"] Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.203513 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-96xvt"] Oct 02 11:35:21 crc kubenswrapper[4658]: I1002 11:35:21.961833 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" path="/var/lib/kubelet/pods/b5acc72c-89e2-455e-ada7-aa71d35b1c20/volumes" Oct 02 11:35:22 crc kubenswrapper[4658]: I1002 11:35:22.112681 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57cd238e-33f1-4536-bcf1-1ca7e57a141a","Type":"ContainerStarted","Data":"b9410646eccdc317e885fd7c8b7215b75f1c9ea313c82c8f605ca52afa1ed9db"} Oct 02 11:35:22 crc kubenswrapper[4658]: I1002 11:35:22.112931 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 11:35:22 crc kubenswrapper[4658]: I1002 11:35:22.134053 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.801665155 podStartE2EDuration="6.134033373s" podCreationTimestamp="2025-10-02 11:35:16 +0000 UTC" firstStartedPulling="2025-10-02 11:35:19.360807509 +0000 UTC m=+1000.251961076" lastFinishedPulling="2025-10-02 11:35:20.693175737 +0000 UTC m=+1001.584329294" observedRunningTime="2025-10-02 11:35:22.133472275 +0000 UTC m=+1003.024625852" watchObservedRunningTime="2025-10-02 11:35:22.134033373 +0000 UTC m=+1003.025186940" Oct 02 11:35:26 crc kubenswrapper[4658]: I1002 11:35:26.474039 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:26 crc kubenswrapper[4658]: E1002 11:35:26.474236 4658 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:35:26 crc kubenswrapper[4658]: E1002 11:35:26.474702 4658 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:35:26 crc kubenswrapper[4658]: E1002 11:35:26.474768 4658 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift podName:6d0e9bcc-e466-4017-92b9-d12e55fc7953 nodeName:}" failed. No retries permitted until 2025-10-02 11:35:42.474737558 +0000 UTC m=+1023.365891125 (durationBeforeRetry 16s). 
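Note on the RemoveContainer / "DeleteContainer returned error ... NotFound" pairs above: these are benign — the container was already gone when the follow-up status and delete calls ran, and cleanup converges because deletion treats "already absent" as the desired end state (the pod's volumes dir is cleaned up right after). The pattern, sketched with illustrative names rather than the kubelet API:

```python
class NotFoundError(Exception):
    """Stand-in for the runtime's NotFound status code."""

class FakeRuntime:
    def __init__(self) -> None:
        self.containers = {"e0537a7f"}
    def remove(self, cid: str) -> None:
        if cid not in self.containers:
            raise NotFoundError(cid)
        self.containers.discard(cid)

def delete_container(runtime: FakeRuntime, cid: str) -> None:
    try:
        runtime.remove(cid)
    except NotFoundError:
        pass  # already removed: the desired state (container absent) holds

rt = FakeRuntime()
delete_container(rt, "e0537a7f")  # removes the container
delete_container(rt, "e0537a7f")  # NotFound -> treated as already done
```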
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift") pod "swift-storage-0" (UID: "6d0e9bcc-e466-4017-92b9-d12e55fc7953") : configmap "swift-ring-files" not found Oct 02 11:35:26 crc kubenswrapper[4658]: I1002 11:35:26.556653 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 11:35:26 crc kubenswrapper[4658]: I1002 11:35:26.556738 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 11:35:26 crc kubenswrapper[4658]: I1002 11:35:26.703780 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 11:35:26 crc kubenswrapper[4658]: I1002 11:35:26.703848 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 11:35:31 crc kubenswrapper[4658]: I1002 11:35:31.713472 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 11:35:32 crc kubenswrapper[4658]: E1002 11:35:32.816420 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00" Oct 02 11:35:32 crc kubenswrapper[4658]: E1002 11:35:32.816986 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,Command:[],Args:[--web.console.templates=/etc/prometheus/consoles --web.console.libraries=/etc/prometheus/console_libraries --config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6mbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(f4544e55-087c-4095-be50-820df44e0a48): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.137668 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.190652 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.229697 4658 generic.go:334] "Generic (PLEG): container finished" podID="0909c66f-f3c6-440c-add2-8784d1c209c7" containerID="e9a75a1fd8e620c094592bc740130f6c28cad485ffadecdb9cd715a43d1c635e" exitCode=0 Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.229767 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-fkzqr" event={"ID":"0909c66f-f3c6-440c-add2-8784d1c209c7","Type":"ContainerDied","Data":"e9a75a1fd8e620c094592bc740130f6c28cad485ffadecdb9cd715a43d1c635e"} Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.231739 4658 generic.go:334] "Generic (PLEG): container finished" podID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerID="8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73" exitCode=0 Oct 02 11:35:33 crc kubenswrapper[4658]: I1002 11:35:33.232399 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerDied","Data":"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73"} Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.241894 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerStarted","Data":"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6"} Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.242236 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.275177 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.422804021 podStartE2EDuration="52.275156311s" podCreationTimestamp="2025-10-02 11:34:42 +0000 UTC" firstStartedPulling="2025-10-02 11:34:47.837139883 +0000 UTC m=+968.728293450" lastFinishedPulling="2025-10-02 11:34:58.689492173 +0000 UTC m=+979.580645740" observedRunningTime="2025-10-02 11:35:34.269409458 +0000 UTC m=+1015.160563035" watchObservedRunningTime="2025-10-02 11:35:34.275156311 +0000 UTC m=+1015.166309868" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.731266 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.800699 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813496 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813583 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813658 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5ddw\" (UniqueName: \"kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813693 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813766 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813803 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.813909 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle\") pod \"0909c66f-f3c6-440c-add2-8784d1c209c7\" (UID: \"0909c66f-f3c6-440c-add2-8784d1c209c7\") " Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.814379 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.815318 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.830909 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.846046 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw" (OuterVolumeSpecName: "kube-api-access-t5ddw") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "kube-api-access-t5ddw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.850810 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.851100 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.858697 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.876041 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts" (OuterVolumeSpecName: "scripts") pod "0909c66f-f3c6-440c-add2-8784d1c209c7" (UID: "0909c66f-f3c6-440c-add2-8784d1c209c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916466 4658 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0909c66f-f3c6-440c-add2-8784d1c209c7-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916514 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916527 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5ddw\" (UniqueName: \"kubernetes.io/projected/0909c66f-f3c6-440c-add2-8784d1c209c7-kube-api-access-t5ddw\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916541 4658 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0909c66f-f3c6-440c-add2-8784d1c209c7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916552 4658 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916563 4658 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4658]: I1002 11:35:34.916573 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0909c66f-f3c6-440c-add2-8784d1c209c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.250196 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerStarted","Data":"7850102b9d52f56cfad95e272c498d00dedf5b4db2dbe19bbab57ec4b2866c53"} Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.251849 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fkzqr" event={"ID":"0909c66f-f3c6-440c-add2-8784d1c209c7","Type":"ContainerDied","Data":"e4ef8b2a7f43f48d4d3b2f784abff760ac167efa8ce65d08d9df06fc5fa62d6e"} Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.251865 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fkzqr" Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.251878 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4ef8b2a7f43f48d4d3b2f784abff760ac167efa8ce65d08d9df06fc5fa62d6e" Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.253180 4658 generic.go:334] "Generic (PLEG): container finished" podID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerID="962021eda53525352e51f7521305c62cf0b06e8762581492eb65c40a47f21d30" exitCode=0 Oct 02 11:35:35 crc kubenswrapper[4658]: I1002 11:35:35.253210 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerDied","Data":"962021eda53525352e51f7521305c62cf0b06e8762581492eb65c40a47f21d30"} Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.262915 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerStarted","Data":"d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026"} Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.264133 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.293812 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.293778135 podStartE2EDuration="54.293778135s" podCreationTimestamp="2025-10-02 11:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:36.290040026 +0000 UTC m=+1017.181193603" watchObservedRunningTime="2025-10-02 11:35:36.293778135 +0000 UTC m=+1017.184931702" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.674603 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hqq9r"] Oct 02 11:35:36 crc kubenswrapper[4658]: E1002 11:35:36.675285 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="init" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.678588 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="init" Oct 02 11:35:36 crc kubenswrapper[4658]: E1002 11:35:36.679335 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0909c66f-f3c6-440c-add2-8784d1c209c7" containerName="swift-ring-rebalance" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.679467 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="0909c66f-f3c6-440c-add2-8784d1c209c7" containerName="swift-ring-rebalance" Oct 02 11:35:36 crc kubenswrapper[4658]: E1002 11:35:36.679979 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="dnsmasq-dns" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.680008 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="dnsmasq-dns" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.680974 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5acc72c-89e2-455e-ada7-aa71d35b1c20" containerName="dnsmasq-dns" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.681022 4658 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0909c66f-f3c6-440c-add2-8784d1c209c7" containerName="swift-ring-rebalance" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.682941 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.692739 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hqq9r"] Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.747333 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltvj\" (UniqueName: \"kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj\") pod \"keystone-db-create-hqq9r\" (UID: \"8d1907c4-bad7-43fe-8982-c4ea70df1a12\") " pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.849390 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltvj\" (UniqueName: \"kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj\") pod \"keystone-db-create-hqq9r\" (UID: \"8d1907c4-bad7-43fe-8982-c4ea70df1a12\") " pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.866323 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dggdx"] Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.873835 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dggdx" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.892561 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltvj\" (UniqueName: \"kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj\") pod \"keystone-db-create-hqq9r\" (UID: \"8d1907c4-bad7-43fe-8982-c4ea70df1a12\") " pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.895410 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dggdx"] Oct 02 11:35:36 crc kubenswrapper[4658]: I1002 11:35:36.950569 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6f2f\" (UniqueName: \"kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f\") pod \"placement-db-create-dggdx\" (UID: \"36cebe9d-b6af-4e46-83ad-ddafab15aefb\") " pod="openstack/placement-db-create-dggdx" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.007423 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.054348 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6f2f\" (UniqueName: \"kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f\") pod \"placement-db-create-dggdx\" (UID: \"36cebe9d-b6af-4e46-83ad-ddafab15aefb\") " pod="openstack/placement-db-create-dggdx" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.089326 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-h8l2l"] Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.090428 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.096556 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6f2f\" (UniqueName: \"kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f\") pod \"placement-db-create-dggdx\" (UID: \"36cebe9d-b6af-4e46-83ad-ddafab15aefb\") " pod="openstack/placement-db-create-dggdx" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.120351 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h8l2l"] Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.156461 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmlml\" (UniqueName: \"kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml\") pod \"glance-db-create-h8l2l\" (UID: \"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e\") " pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.190831 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dggdx" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.260311 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmlml\" (UniqueName: \"kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml\") pod \"glance-db-create-h8l2l\" (UID: \"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e\") " pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.282627 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmlml\" (UniqueName: \"kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml\") pod \"glance-db-create-h8l2l\" (UID: \"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e\") " pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.473980 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.534898 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hqq9r"] Oct 02 11:35:37 crc kubenswrapper[4658]: W1002 11:35:37.539644 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1907c4_bad7_43fe_8982_c4ea70df1a12.slice/crio-3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e WatchSource:0}: Error finding container 3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e: Status 404 returned error can't find the container with id 3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.667386 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dggdx"] Oct 02 11:35:37 crc kubenswrapper[4658]: W1002 11:35:37.682711 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36cebe9d_b6af_4e46_83ad_ddafab15aefb.slice/crio-6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd WatchSource:0}: Error finding container 6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd: Status 404 returned error can't find the container with id 6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd Oct 02 11:35:37 crc kubenswrapper[4658]: I1002 11:35:37.937902 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h8l2l"] Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.287827 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dggdx" event={"ID":"36cebe9d-b6af-4e46-83ad-ddafab15aefb","Type":"ContainerStarted","Data":"c8a33a3288e8c0b84b45ccd48df105b38969ec319dba5613e33d13edeb538045"} Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.288155 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dggdx" event={"ID":"36cebe9d-b6af-4e46-83ad-ddafab15aefb","Type":"ContainerStarted","Data":"6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd"} Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.292871 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hqq9r" event={"ID":"8d1907c4-bad7-43fe-8982-c4ea70df1a12","Type":"ContainerStarted","Data":"76ea77dc33c44501a56b24922634e9c587ba3159351f3eb5f52f83f91a9090f8"} Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.293024 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hqq9r" event={"ID":"8d1907c4-bad7-43fe-8982-c4ea70df1a12","Type":"ContainerStarted","Data":"3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e"} Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.308487 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dggdx" podStartSLOduration=2.308467445 podStartE2EDuration="2.308467445s" podCreationTimestamp="2025-10-02 11:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:38.302180675 +0000 UTC m=+1019.193334242" watchObservedRunningTime="2025-10-02 11:35:38.308467445 +0000 UTC m=+1019.199621012" Oct 02 11:35:38 crc kubenswrapper[4658]: I1002 11:35:38.322092 4658 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/keystone-db-create-hqq9r" podStartSLOduration=2.322072607 podStartE2EDuration="2.322072607s" podCreationTimestamp="2025-10-02 11:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:38.316662775 +0000 UTC m=+1019.207816352" watchObservedRunningTime="2025-10-02 11:35:38.322072607 +0000 UTC m=+1019.213226174" Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.309773 4658 generic.go:334] "Generic (PLEG): container finished" podID="8d1907c4-bad7-43fe-8982-c4ea70df1a12" containerID="76ea77dc33c44501a56b24922634e9c587ba3159351f3eb5f52f83f91a9090f8" exitCode=0 Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.309956 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hqq9r" event={"ID":"8d1907c4-bad7-43fe-8982-c4ea70df1a12","Type":"ContainerDied","Data":"76ea77dc33c44501a56b24922634e9c587ba3159351f3eb5f52f83f91a9090f8"} Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.312458 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h8l2l" event={"ID":"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e","Type":"ContainerStarted","Data":"c170047dd7885f1e41bbad1040185dd77ef6e7bce20a6053a4da939d091f9c3a"} Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.314879 4658 generic.go:334] "Generic (PLEG): container finished" podID="36cebe9d-b6af-4e46-83ad-ddafab15aefb" containerID="c8a33a3288e8c0b84b45ccd48df105b38969ec319dba5613e33d13edeb538045" exitCode=0 Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.314920 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dggdx" event={"ID":"36cebe9d-b6af-4e46-83ad-ddafab15aefb","Type":"ContainerDied","Data":"c8a33a3288e8c0b84b45ccd48df105b38969ec319dba5613e33d13edeb538045"} Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.375654 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-mccmm"] Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.376957 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.391783 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mccmm"] Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.395795 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcjhh\" (UniqueName: \"kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh\") pod \"watcher-db-create-mccmm\" (UID: \"25a91ffa-90bd-4ca9-9441-fccbed461ced\") " pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:39 crc kubenswrapper[4658]: E1002 11:35:39.409331 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.497124 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcjhh\" (UniqueName: \"kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh\") pod \"watcher-db-create-mccmm\" (UID: \"25a91ffa-90bd-4ca9-9441-fccbed461ced\") " pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.521521 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcjhh\" (UniqueName: \"kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh\") pod \"watcher-db-create-mccmm\" (UID: \"25a91ffa-90bd-4ca9-9441-fccbed461ced\") " pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:39 crc kubenswrapper[4658]: I1002 11:35:39.704412 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.142973 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mccmm"] Oct 02 11:35:40 crc kubenswrapper[4658]: W1002 11:35:40.157737 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a91ffa_90bd_4ca9_9441_fccbed461ced.slice/crio-15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e WatchSource:0}: Error finding container 15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e: Status 404 returned error can't find the container with id 15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.326138 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerStarted","Data":"5d254e9aed8fbb8e98c0cb6fa78fefc38da6ca376af8b0d540609edbf4aa86ae"} Oct 02 11:35:40 crc kubenswrapper[4658]: E1002 11:35:40.328487 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.332756 4658 generic.go:334] "Generic (PLEG): container finished" podID="c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" containerID="1f72312c38b0d2a1dfdcf6b43921ffb38074ef2c54fcdbdf219a4af1f9eb1dbf" exitCode=0 Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.332929 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h8l2l" event={"ID":"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e","Type":"ContainerDied","Data":"1f72312c38b0d2a1dfdcf6b43921ffb38074ef2c54fcdbdf219a4af1f9eb1dbf"} Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.334814 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mccmm" event={"ID":"25a91ffa-90bd-4ca9-9441-fccbed461ced","Type":"ContainerStarted","Data":"15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e"} Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.397470 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-mccmm" podStartSLOduration=1.397444936 podStartE2EDuration="1.397444936s" podCreationTimestamp="2025-10-02 11:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:40.388992717 +0000 UTC m=+1021.280146294" watchObservedRunningTime="2025-10-02 11:35:40.397444936 +0000 UTC m=+1021.288598503" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.782825 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.788579 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dggdx" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.925775 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltvj\" (UniqueName: \"kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj\") pod \"8d1907c4-bad7-43fe-8982-c4ea70df1a12\" (UID: \"8d1907c4-bad7-43fe-8982-c4ea70df1a12\") " Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.925965 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6f2f\" (UniqueName: \"kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f\") pod \"36cebe9d-b6af-4e46-83ad-ddafab15aefb\" (UID: \"36cebe9d-b6af-4e46-83ad-ddafab15aefb\") " Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.932524 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f" (OuterVolumeSpecName: "kube-api-access-t6f2f") pod "36cebe9d-b6af-4e46-83ad-ddafab15aefb" (UID: "36cebe9d-b6af-4e46-83ad-ddafab15aefb"). InnerVolumeSpecName "kube-api-access-t6f2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:40 crc kubenswrapper[4658]: I1002 11:35:40.932582 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj" (OuterVolumeSpecName: "kube-api-access-hltvj") pod "8d1907c4-bad7-43fe-8982-c4ea70df1a12" (UID: "8d1907c4-bad7-43fe-8982-c4ea70df1a12"). InnerVolumeSpecName "kube-api-access-hltvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.028336 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6f2f\" (UniqueName: \"kubernetes.io/projected/36cebe9d-b6af-4e46-83ad-ddafab15aefb-kube-api-access-t6f2f\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.028389 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltvj\" (UniqueName: \"kubernetes.io/projected/8d1907c4-bad7-43fe-8982-c4ea70df1a12-kube-api-access-hltvj\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.345562 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dggdx" event={"ID":"36cebe9d-b6af-4e46-83ad-ddafab15aefb","Type":"ContainerDied","Data":"6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd"} Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.345590 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dggdx" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.345609 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eaddf1fc048282dc16890b1592852e786f5d1f2e853b49bb4c293decda5dbdd" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.347464 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hqq9r" event={"ID":"8d1907c4-bad7-43fe-8982-c4ea70df1a12","Type":"ContainerDied","Data":"3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e"} Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.347502 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edcc8e3eb53e78869c709c719fc78c1ad47fa4a5673a800f564d3a1779eaa0e" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.347971 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hqq9r" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.349502 4658 generic.go:334] "Generic (PLEG): container finished" podID="25a91ffa-90bd-4ca9-9441-fccbed461ced" containerID="227fe2f696a1aecbd575f7868fc7a7d728895e8dac4c85d318a49efe412cb590" exitCode=0 Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.349574 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mccmm" event={"ID":"25a91ffa-90bd-4ca9-9441-fccbed461ced","Type":"ContainerDied","Data":"227fe2f696a1aecbd575f7868fc7a7d728895e8dac4c85d318a49efe412cb590"} Oct 02 11:35:41 crc kubenswrapper[4658]: E1002 11:35:41.351563 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.669568 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.840239 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmlml\" (UniqueName: \"kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml\") pod \"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e\" (UID: \"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e\") " Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.852601 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml" (OuterVolumeSpecName: "kube-api-access-cmlml") pod "c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" (UID: "c65aa35f-e3ca-4da7-af49-56f8c1af3e0e"). InnerVolumeSpecName "kube-api-access-cmlml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:41 crc kubenswrapper[4658]: I1002 11:35:41.942523 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmlml\" (UniqueName: \"kubernetes.io/projected/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e-kube-api-access-cmlml\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.359681 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h8l2l" event={"ID":"c65aa35f-e3ca-4da7-af49-56f8c1af3e0e","Type":"ContainerDied","Data":"c170047dd7885f1e41bbad1040185dd77ef6e7bce20a6053a4da939d091f9c3a"} Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.359743 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c170047dd7885f1e41bbad1040185dd77ef6e7bce20a6053a4da939d091f9c3a" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.359761 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h8l2l" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.529944 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-h2htr" podUID="ed2f1df6-db7a-483e-a80d-298f12a389c8" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:35:42 crc kubenswrapper[4658]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 11:35:42 crc kubenswrapper[4658]: > Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.554121 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.561842 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d0e9bcc-e466-4017-92b9-d12e55fc7953-etc-swift\") pod \"swift-storage-0\" (UID: \"6d0e9bcc-e466-4017-92b9-d12e55fc7953\") " pod="openstack/swift-storage-0" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.722833 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.784162 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.859004 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcjhh\" (UniqueName: \"kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh\") pod \"25a91ffa-90bd-4ca9-9441-fccbed461ced\" (UID: \"25a91ffa-90bd-4ca9-9441-fccbed461ced\") " Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.868708 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh" (OuterVolumeSpecName: "kube-api-access-vcjhh") pod "25a91ffa-90bd-4ca9-9441-fccbed461ced" (UID: "25a91ffa-90bd-4ca9-9441-fccbed461ced"). InnerVolumeSpecName "kube-api-access-vcjhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:42 crc kubenswrapper[4658]: I1002 11:35:42.961134 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcjhh\" (UniqueName: \"kubernetes.io/projected/25a91ffa-90bd-4ca9-9441-fccbed461ced-kube-api-access-vcjhh\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.332418 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:35:43 crc kubenswrapper[4658]: W1002 11:35:43.336515 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d0e9bcc_e466_4017_92b9_d12e55fc7953.slice/crio-a1017ea1817a5241dbb69b214b0b3b1102718a66e74e0990e8b2dc3b0ce9fad9 WatchSource:0}: Error finding container a1017ea1817a5241dbb69b214b0b3b1102718a66e74e0990e8b2dc3b0ce9fad9: Status 404 returned error can't find the container with id a1017ea1817a5241dbb69b214b0b3b1102718a66e74e0990e8b2dc3b0ce9fad9 Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.380552 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"a1017ea1817a5241dbb69b214b0b3b1102718a66e74e0990e8b2dc3b0ce9fad9"} Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.382085 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mccmm" event={"ID":"25a91ffa-90bd-4ca9-9441-fccbed461ced","Type":"ContainerDied","Data":"15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e"} Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.382132 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e534a6b813765fb61cc067a6713672ab1ee1a2080ea9eaa18f1ef8d090bc5e" Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.382137 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-mccmm" Oct 02 11:35:43 crc kubenswrapper[4658]: I1002 11:35:43.422019 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:35:45 crc kubenswrapper[4658]: I1002 11:35:45.407219 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"16cc94b7ac51f1861f26d959b823b972ccaba0db42b6a0ed1bec91c930adf111"} Oct 02 11:35:45 crc kubenswrapper[4658]: I1002 11:35:45.407603 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"763ac4f14342734014a1284b17e6e6799f55f2b33e3473cf485f15fe7b8805f0"} Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.418189 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"c1f7f5ad260ef8029ad7b7f96e48bc7788697ee173e2f05383549b48f6de34b1"} Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.418642 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"394442f905fdcb030249bcee2eb4527a4271456c4bd3ba97f6cac195c16da90f"} Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.627508 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6ffc-account-create-2fxc9"] Oct 02 11:35:46 crc kubenswrapper[4658]: E1002 11:35:46.627925 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1907c4-bad7-43fe-8982-c4ea70df1a12" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.627940 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1907c4-bad7-43fe-8982-c4ea70df1a12" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: E1002 11:35:46.627971 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36cebe9d-b6af-4e46-83ad-ddafab15aefb" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.627978 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="36cebe9d-b6af-4e46-83ad-ddafab15aefb" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: E1002 11:35:46.627999 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a91ffa-90bd-4ca9-9441-fccbed461ced" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628009 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a91ffa-90bd-4ca9-9441-fccbed461ced" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: E1002 11:35:46.628021 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628029 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628215 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628235 4658 
memory_manager.go:354] "RemoveStaleState removing state" podUID="36cebe9d-b6af-4e46-83ad-ddafab15aefb" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628253 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1907c4-bad7-43fe-8982-c4ea70df1a12" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628264 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a91ffa-90bd-4ca9-9441-fccbed461ced" containerName="mariadb-database-create" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.628926 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.632052 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.648460 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ffc-account-create-2fxc9"] Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.727944 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjw2x\" (UniqueName: \"kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x\") pod \"keystone-6ffc-account-create-2fxc9\" (UID: \"41e00451-8e3d-4e55-935c-7df7a71c261e\") " pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.830258 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjw2x\" (UniqueName: \"kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x\") pod \"keystone-6ffc-account-create-2fxc9\" (UID: \"41e00451-8e3d-4e55-935c-7df7a71c261e\") " pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.865331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjw2x\" (UniqueName: \"kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x\") pod \"keystone-6ffc-account-create-2fxc9\" (UID: \"41e00451-8e3d-4e55-935c-7df7a71c261e\") " pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.932488 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bbc1-account-create-47jjw"] Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.933610 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.938432 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bbc1-account-create-47jjw"] Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.996613 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 11:35:46 crc kubenswrapper[4658]: I1002 11:35:46.997380 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.033127 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t965q\" (UniqueName: \"kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q\") pod \"placement-bbc1-account-create-47jjw\" (UID: \"9d84d533-3387-4b94-b519-c354db47dea0\") " pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.135569 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t965q\" (UniqueName: \"kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q\") pod \"placement-bbc1-account-create-47jjw\" (UID: \"9d84d533-3387-4b94-b519-c354db47dea0\") " pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.153637 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t965q\" (UniqueName: \"kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q\") pod \"placement-bbc1-account-create-47jjw\" (UID: \"9d84d533-3387-4b94-b519-c354db47dea0\") " pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.240353 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-95c4-account-create-wfwww"] Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.241896 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.244184 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.248569 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95c4-account-create-wfwww"] Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.317057 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.338734 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xfcg\" (UniqueName: \"kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg\") pod \"glance-95c4-account-create-wfwww\" (UID: \"e45d37dd-6bcd-4d3d-ab46-dabd8242a213\") " pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.444094 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xfcg\" (UniqueName: \"kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg\") pod \"glance-95c4-account-create-wfwww\" (UID: \"e45d37dd-6bcd-4d3d-ab46-dabd8242a213\") " pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.466050 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xfcg\" (UniqueName: \"kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg\") pod \"glance-95c4-account-create-wfwww\" (UID: \"e45d37dd-6bcd-4d3d-ab46-dabd8242a213\") " pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.538384 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-h2htr" podUID="ed2f1df6-db7a-483e-a80d-298f12a389c8" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:35:47 crc kubenswrapper[4658]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 11:35:47 crc kubenswrapper[4658]: > Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.554794 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.559999 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tbnj8" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.598125 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.655263 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ffc-account-create-2fxc9"] Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.880645 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bbc1-account-create-47jjw"] Oct 02 11:35:47 crc kubenswrapper[4658]: W1002 11:35:47.899015 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d84d533_3387_4b94_b519_c354db47dea0.slice/crio-9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a WatchSource:0}: Error finding container 9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a: Status 404 returned error can't find the container with id 9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.983141 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-h2htr-config-v5fft"] Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.993002 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h2htr-config-v5fft"] Oct 02 11:35:47 crc kubenswrapper[4658]: I1002 11:35:47.993139 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.000175 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059042 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059128 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059473 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4bf\" (UniqueName: \"kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059574 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059614 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.059687 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.161758 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.162195 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.162253 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.162284 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.162344 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.162517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh4bf\" (UniqueName: \"kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.164934 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.165268 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.165267 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.165348 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.165593 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.202217 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4bf\" (UniqueName: \"kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf\") pod \"ovn-controller-h2htr-config-v5fft\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.310061 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95c4-account-create-wfwww"] Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.351020 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.498055 4658 generic.go:334] "Generic (PLEG): container finished" podID="41e00451-8e3d-4e55-935c-7df7a71c261e" containerID="25789d8280b0feee6a663dba100a82e9f3725e5e1f99eca23a03ea3861044b67" exitCode=0 Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.498246 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ffc-account-create-2fxc9" event={"ID":"41e00451-8e3d-4e55-935c-7df7a71c261e","Type":"ContainerDied","Data":"25789d8280b0feee6a663dba100a82e9f3725e5e1f99eca23a03ea3861044b67"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.498512 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ffc-account-create-2fxc9" event={"ID":"41e00451-8e3d-4e55-935c-7df7a71c261e","Type":"ContainerStarted","Data":"8c5b145e04faafef58e9f931f4fd52a8fafd29522e6d7b86db0eb6caa801db55"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.506171 4658 generic.go:334] "Generic (PLEG): container finished" podID="9d84d533-3387-4b94-b519-c354db47dea0" containerID="8d6b19e7cf64f6333f6b7f64511c0ce3de509dd36acc8f9d4967e05a1ea47c59" exitCode=0 Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.506241 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bbc1-account-create-47jjw" event={"ID":"9d84d533-3387-4b94-b519-c354db47dea0","Type":"ContainerDied","Data":"8d6b19e7cf64f6333f6b7f64511c0ce3de509dd36acc8f9d4967e05a1ea47c59"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.506269 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bbc1-account-create-47jjw" event={"ID":"9d84d533-3387-4b94-b519-c354db47dea0","Type":"ContainerStarted","Data":"9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.519754 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c4-account-create-wfwww" event={"ID":"e45d37dd-6bcd-4d3d-ab46-dabd8242a213","Type":"ContainerStarted","Data":"c6362eaa1227dd18e57f7a67515144668db54c70fca1582fa8a97e112944d25e"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.534335 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"fb58734f9e712ee118f414bfa4e58f05be96daa4df87aab21d615eb5c1de1f42"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.534403 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"e39726a67977383c7bda9eb1093305c046af1dd0bcd0e7cfa1142a6b05bd1268"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.534421 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"29d2f98a50b53f30cc54513b5648f89448049c193eccf492a954c624623b81a4"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.534482 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"2d37400d99fe812d25dbfbd0a15066ba54e2d73efafb8994bee5029abba5d286"} Oct 02 11:35:48 crc kubenswrapper[4658]: I1002 11:35:48.855379 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-h2htr-config-v5fft"] Oct 02 11:35:48 crc kubenswrapper[4658]: W1002 11:35:48.859494 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1adabbc0_e937_4363_9eb9_9e02844dd3da.slice/crio-4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58 WatchSource:0}: Error finding container 4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58: Status 404 returned error can't find the container with id 4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58 Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.520188 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-5eba-account-create-4bnj5"] Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.523327 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5eba-account-create-4bnj5" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.526715 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.534785 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-5eba-account-create-4bnj5"] Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.548842 4658 generic.go:334] "Generic (PLEG): container finished" podID="e45d37dd-6bcd-4d3d-ab46-dabd8242a213" containerID="b2aa095548c3094b11dbc20535bc980660ad848e8d2106b9b0157da4febf98e2" exitCode=0 Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.548919 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c4-account-create-wfwww" event={"ID":"e45d37dd-6bcd-4d3d-ab46-dabd8242a213","Type":"ContainerDied","Data":"b2aa095548c3094b11dbc20535bc980660ad848e8d2106b9b0157da4febf98e2"} Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.550669 4658 generic.go:334] "Generic (PLEG): container finished" podID="1adabbc0-e937-4363-9eb9-9e02844dd3da" containerID="bb8d1255886557c2f7bdfa62d0d469477dd7b6fb601d51a51307d11ed1e234a4" exitCode=0 Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.550725 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h2htr-config-v5fft" event={"ID":"1adabbc0-e937-4363-9eb9-9e02844dd3da","Type":"ContainerDied","Data":"bb8d1255886557c2f7bdfa62d0d469477dd7b6fb601d51a51307d11ed1e234a4"} Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.550782 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h2htr-config-v5fft" event={"ID":"1adabbc0-e937-4363-9eb9-9e02844dd3da","Type":"ContainerStarted","Data":"4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58"} Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.587161 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2dp2\" (UniqueName: \"kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2\") pod \"watcher-5eba-account-create-4bnj5\" (UID: \"56dc1de8-4f80-4a7b-b461-6b5e830a889b\") " pod="openstack/watcher-5eba-account-create-4bnj5" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.690001 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2dp2\" (UniqueName: \"kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2\") pod \"watcher-5eba-account-create-4bnj5\" (UID: \"56dc1de8-4f80-4a7b-b461-6b5e830a889b\") " 
pod="openstack/watcher-5eba-account-create-4bnj5" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.709712 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2dp2\" (UniqueName: \"kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2\") pod \"watcher-5eba-account-create-4bnj5\" (UID: \"56dc1de8-4f80-4a7b-b461-6b5e830a889b\") " pod="openstack/watcher-5eba-account-create-4bnj5" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.841673 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5eba-account-create-4bnj5" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.968342 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.976589 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.995360 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjw2x\" (UniqueName: \"kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x\") pod \"41e00451-8e3d-4e55-935c-7df7a71c261e\" (UID: \"41e00451-8e3d-4e55-935c-7df7a71c261e\") " Oct 02 11:35:49 crc kubenswrapper[4658]: I1002 11:35:49.995468 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t965q\" (UniqueName: \"kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q\") pod \"9d84d533-3387-4b94-b519-c354db47dea0\" (UID: \"9d84d533-3387-4b94-b519-c354db47dea0\") " Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.006574 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q" (OuterVolumeSpecName: "kube-api-access-t965q") pod "9d84d533-3387-4b94-b519-c354db47dea0" (UID: "9d84d533-3387-4b94-b519-c354db47dea0"). InnerVolumeSpecName "kube-api-access-t965q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.007861 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x" (OuterVolumeSpecName: "kube-api-access-cjw2x") pod "41e00451-8e3d-4e55-935c-7df7a71c261e" (UID: "41e00451-8e3d-4e55-935c-7df7a71c261e"). InnerVolumeSpecName "kube-api-access-cjw2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.097227 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjw2x\" (UniqueName: \"kubernetes.io/projected/41e00451-8e3d-4e55-935c-7df7a71c261e-kube-api-access-cjw2x\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.097266 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t965q\" (UniqueName: \"kubernetes.io/projected/9d84d533-3387-4b94-b519-c354db47dea0-kube-api-access-t965q\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.355227 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-5eba-account-create-4bnj5"] Oct 02 11:35:50 crc kubenswrapper[4658]: W1002 11:35:50.378329 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56dc1de8_4f80_4a7b_b461_6b5e830a889b.slice/crio-f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449 WatchSource:0}: Error finding container f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449: Status 404 returned error can't find the container with id f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449 Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.562065 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5eba-account-create-4bnj5" event={"ID":"56dc1de8-4f80-4a7b-b461-6b5e830a889b","Type":"ContainerStarted","Data":"f6f34af0f1cc13ac3cd33a10c18ee05641736727140c259693baba2af2659b52"} Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.562493 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5eba-account-create-4bnj5" event={"ID":"56dc1de8-4f80-4a7b-b461-6b5e830a889b","Type":"ContainerStarted","Data":"f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449"} Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.564747 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ffc-account-create-2fxc9" event={"ID":"41e00451-8e3d-4e55-935c-7df7a71c261e","Type":"ContainerDied","Data":"8c5b145e04faafef58e9f931f4fd52a8fafd29522e6d7b86db0eb6caa801db55"} Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.564770 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c5b145e04faafef58e9f931f4fd52a8fafd29522e6d7b86db0eb6caa801db55" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.564809 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ffc-account-create-2fxc9" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.567537 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bbc1-account-create-47jjw" event={"ID":"9d84d533-3387-4b94-b519-c354db47dea0","Type":"ContainerDied","Data":"9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a"} Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.567586 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f2af6e91eed5790ce01d164197e43a19bf3112a700ae881ed6d0fb478253f9a" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.567666 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bbc1-account-create-47jjw" Oct 02 11:35:50 crc kubenswrapper[4658]: I1002 11:35:50.580838 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-5eba-account-create-4bnj5" podStartSLOduration=1.5808138870000001 podStartE2EDuration="1.580813887s" podCreationTimestamp="2025-10-02 11:35:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:50.576242762 +0000 UTC m=+1031.467396319" watchObservedRunningTime="2025-10-02 11:35:50.580813887 +0000 UTC m=+1031.471967454" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.034509 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.044045 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123627 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123676 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xfcg\" (UniqueName: \"kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg\") pod \"e45d37dd-6bcd-4d3d-ab46-dabd8242a213\" (UID: \"e45d37dd-6bcd-4d3d-ab46-dabd8242a213\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123702 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123769 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123857 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123936 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh4bf\" (UniqueName: \"kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123940 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123979 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.123960 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn\") pod \"1adabbc0-e937-4363-9eb9-9e02844dd3da\" (UID: \"1adabbc0-e937-4363-9eb9-9e02844dd3da\") " Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.124006 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run" (OuterVolumeSpecName: "var-run") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.124733 4658 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.124761 4658 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.124775 4658 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1adabbc0-e937-4363-9eb9-9e02844dd3da-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.124997 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.125063 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts" (OuterVolumeSpecName: "scripts") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.129542 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf" (OuterVolumeSpecName: "kube-api-access-hh4bf") pod "1adabbc0-e937-4363-9eb9-9e02844dd3da" (UID: "1adabbc0-e937-4363-9eb9-9e02844dd3da"). InnerVolumeSpecName "kube-api-access-hh4bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.130134 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg" (OuterVolumeSpecName: "kube-api-access-4xfcg") pod "e45d37dd-6bcd-4d3d-ab46-dabd8242a213" (UID: "e45d37dd-6bcd-4d3d-ab46-dabd8242a213"). InnerVolumeSpecName "kube-api-access-4xfcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.226127 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh4bf\" (UniqueName: \"kubernetes.io/projected/1adabbc0-e937-4363-9eb9-9e02844dd3da-kube-api-access-hh4bf\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.226160 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xfcg\" (UniqueName: \"kubernetes.io/projected/e45d37dd-6bcd-4d3d-ab46-dabd8242a213-kube-api-access-4xfcg\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.226171 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.226181 4658 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1adabbc0-e937-4363-9eb9-9e02844dd3da-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.577620 4658 generic.go:334] "Generic (PLEG): container finished" podID="56dc1de8-4f80-4a7b-b461-6b5e830a889b" containerID="f6f34af0f1cc13ac3cd33a10c18ee05641736727140c259693baba2af2659b52" exitCode=0 Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.577702 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5eba-account-create-4bnj5" event={"ID":"56dc1de8-4f80-4a7b-b461-6b5e830a889b","Type":"ContainerDied","Data":"f6f34af0f1cc13ac3cd33a10c18ee05641736727140c259693baba2af2659b52"} Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.582822 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-95c4-account-create-wfwww" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.582821 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c4-account-create-wfwww" event={"ID":"e45d37dd-6bcd-4d3d-ab46-dabd8242a213","Type":"ContainerDied","Data":"c6362eaa1227dd18e57f7a67515144668db54c70fca1582fa8a97e112944d25e"} Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.582878 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6362eaa1227dd18e57f7a67515144668db54c70fca1582fa8a97e112944d25e" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.589839 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h2htr-config-v5fft" event={"ID":"1adabbc0-e937-4363-9eb9-9e02844dd3da","Type":"ContainerDied","Data":"4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58"} Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.589884 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d020f726e3e609993c2e7407b6a9b10858d02ddf25766c2dafe0e857b148e58" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.589958 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h2htr-config-v5fft" Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.594988 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"2ef1673c679b90c739879aaacb407a227cc93d1693ff9abe1324603d956e9548"} Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.595040 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"9b9c2863a899075303c41acad952df92c9bac0ef51046205210b066fbbaa3cb4"} Oct 02 11:35:51 crc kubenswrapper[4658]: I1002 11:35:51.595053 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"79f774c6a6df68801083086fddecf0a86854545bdd7d710934122308fc12b442"} Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.172250 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-h2htr-config-v5fft"] Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.183578 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-h2htr-config-v5fft"] Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.374956 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kbvvc"] Oct 02 11:35:52 crc kubenswrapper[4658]: E1002 11:35:52.375277 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e00451-8e3d-4e55-935c-7df7a71c261e" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375311 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e00451-8e3d-4e55-935c-7df7a71c261e" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: E1002 11:35:52.375341 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d84d533-3387-4b94-b519-c354db47dea0" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375348 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d84d533-3387-4b94-b519-c354db47dea0" 
containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: E1002 11:35:52.375363 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adabbc0-e937-4363-9eb9-9e02844dd3da" containerName="ovn-config" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375370 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adabbc0-e937-4363-9eb9-9e02844dd3da" containerName="ovn-config" Oct 02 11:35:52 crc kubenswrapper[4658]: E1002 11:35:52.375390 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45d37dd-6bcd-4d3d-ab46-dabd8242a213" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375396 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45d37dd-6bcd-4d3d-ab46-dabd8242a213" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375543 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adabbc0-e937-4363-9eb9-9e02844dd3da" containerName="ovn-config" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375553 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d84d533-3387-4b94-b519-c354db47dea0" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375567 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e00451-8e3d-4e55-935c-7df7a71c261e" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.375580 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45d37dd-6bcd-4d3d-ab46-dabd8242a213" containerName="mariadb-account-create" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.376102 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.378014 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.378127 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t7vc6" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.397364 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kbvvc"] Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.447028 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.447086 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.447133 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqzz7\" (UniqueName: \"kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.447183 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.514635 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-h2htr" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.549059 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.549252 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.549334 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.549409 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vqzz7\" (UniqueName: \"kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.555107 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.555531 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.556277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.569488 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqzz7\" (UniqueName: \"kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7\") pod \"glance-db-sync-kbvvc\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " pod="openstack/glance-db-sync-kbvvc" Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.610324 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"1e1a4eac2f2ae3117c160289217fb099b856e8738892f12dd80cfadbafd021c1"} Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.610585 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"9171723e162828fb6f66fdb3b88879da17da74fff854fdb8e6759a743520b25d"} Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.610598 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"136feeed3a72f16aef323d250409b2166379c25dadfc025858afa114506cd5b2"} Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.610606 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d0e9bcc-e466-4017-92b9-d12e55fc7953","Type":"ContainerStarted","Data":"d2112a054e6729e094b60b2a3e7b1522bcb49a7cdd2b733efaf61fc221c32519"} Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.654453 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.154527779 podStartE2EDuration="43.65442811s" podCreationTimestamp="2025-10-02 11:35:09 +0000 UTC" firstStartedPulling="2025-10-02 11:35:43.338711872 +0000 UTC m=+1024.229865439" lastFinishedPulling="2025-10-02 11:35:50.838612193 +0000 UTC m=+1031.729765770" observedRunningTime="2025-10-02 11:35:52.647773219 +0000 UTC m=+1033.538926786" watchObservedRunningTime="2025-10-02 
Oct 02 11:35:52 crc kubenswrapper[4658]: I1002 11:35:52.691694 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kbvvc"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:52.994841 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"]
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:52.997593 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5eba-account-create-4bnj5"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:52.998164 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.000389 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.031362 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"]
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177230 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2dp2\" (UniqueName: \"kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2\") pod \"56dc1de8-4f80-4a7b-b461-6b5e830a889b\" (UID: \"56dc1de8-4f80-4a7b-b461-6b5e830a889b\") "
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177502 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177545 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177570 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177639 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjbx\" (UniqueName: \"kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177700 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.177839 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.190604 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2" (OuterVolumeSpecName: "kube-api-access-c2dp2") pod "56dc1de8-4f80-4a7b-b461-6b5e830a889b" (UID: "56dc1de8-4f80-4a7b-b461-6b5e830a889b"). InnerVolumeSpecName "kube-api-access-c2dp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.279882 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.279922 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.279984 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjbx\" (UniqueName: \"kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.280048 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.280099 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.280147 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.280203 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2dp2\" (UniqueName: \"kubernetes.io/projected/56dc1de8-4f80-4a7b-b461-6b5e830a889b-kube-api-access-c2dp2\") on node \"crc\" DevicePath \"\""
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.281066 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.281168 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.281436 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.281438 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.281504 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.303365 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjbx\" (UniqueName: \"kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx\") pod \"dnsmasq-dns-77585f5f8c-79xjk\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.334471 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.628539 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5eba-account-create-4bnj5"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.630265 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5eba-account-create-4bnj5" event={"ID":"56dc1de8-4f80-4a7b-b461-6b5e830a889b","Type":"ContainerDied","Data":"f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449"}
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.630348 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f926a8c0d7540017426fdd7ffb907ec31c23d8a44ab3bb26a49fa5cdc84c2449"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.849504 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.936838 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"]
Oct 02 11:35:53 crc kubenswrapper[4658]: I1002 11:35:53.963224 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adabbc0-e937-4363-9eb9-9e02844dd3da" path="/var/lib/kubelet/pods/1adabbc0-e937-4363-9eb9-9e02844dd3da/volumes"
Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.029874 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kbvvc"]
Oct 02 11:35:54 crc kubenswrapper[4658]: W1002 11:35:54.038282 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87a291e0_0291_4591_8d80_818338d6ae2d.slice/crio-673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd WatchSource:0}: Error finding container 673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd: Status 404 returned error can't find the container with id 673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd
Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.193200 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hdzkf"]
Oct 02 11:35:54 crc kubenswrapper[4658]: E1002 11:35:54.193649 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dc1de8-4f80-4a7b-b461-6b5e830a889b" containerName="mariadb-account-create"
Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.193670 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dc1de8-4f80-4a7b-b461-6b5e830a889b" containerName="mariadb-account-create"
Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.193881 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="56dc1de8-4f80-4a7b-b461-6b5e830a889b" containerName="mariadb-account-create"
Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.194600 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hdzkf"
Need to start a new one" pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.250056 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hdzkf"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.308252 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgcx\" (UniqueName: \"kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx\") pod \"cinder-db-create-hdzkf\" (UID: \"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60\") " pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.325912 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pd2cm"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.326990 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pd2cm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.358372 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pd2cm"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.411438 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgcx\" (UniqueName: \"kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx\") pod \"cinder-db-create-hdzkf\" (UID: \"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60\") " pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.411607 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4ph5\" (UniqueName: \"kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5\") pod \"barbican-db-create-pd2cm\" (UID: \"7ba10cce-80f9-474b-aece-681f238af730\") " pod="openstack/barbican-db-create-pd2cm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.467030 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgcx\" (UniqueName: \"kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx\") pod \"cinder-db-create-hdzkf\" (UID: \"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60\") " pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.514350 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4ph5\" (UniqueName: \"kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5\") pod \"barbican-db-create-pd2cm\" (UID: \"7ba10cce-80f9-474b-aece-681f238af730\") " pod="openstack/barbican-db-create-pd2cm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.519851 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.533176 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4ph5\" (UniqueName: \"kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5\") pod \"barbican-db-create-pd2cm\" (UID: \"7ba10cce-80f9-474b-aece-681f238af730\") " pod="openstack/barbican-db-create-pd2cm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.594753 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-g7rkz"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.595971 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g7rkz" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.608982 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g7rkz"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.663669 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pd2cm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.691768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" event={"ID":"8145034e-a3f8-413f-b306-26f4b240bd85","Type":"ContainerStarted","Data":"41ec6a4251d87763fba7c1a1723bae95ffe66d7e0f5d5fb8907fd204db52f60f"} Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.716927 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfth\" (UniqueName: \"kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth\") pod \"neutron-db-create-g7rkz\" (UID: \"721f38ad-db77-4b36-aa92-0c5ea5821709\") " pod="openstack/neutron-db-create-g7rkz" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.720620 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbvvc" event={"ID":"87a291e0-0291-4591-8d80-818338d6ae2d","Type":"ContainerStarted","Data":"673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd"} Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.726821 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cgghm"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.727962 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.730316 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bltpc" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.732400 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.732579 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.732814 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.740759 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cgghm"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.818336 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.818430 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wrk\" (UniqueName: \"kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.818503 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.818540 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfth\" (UniqueName: \"kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth\") pod \"neutron-db-create-g7rkz\" (UID: \"721f38ad-db77-4b36-aa92-0c5ea5821709\") " pod="openstack/neutron-db-create-g7rkz" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.852521 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfth\" (UniqueName: \"kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth\") pod \"neutron-db-create-g7rkz\" (UID: \"721f38ad-db77-4b36-aa92-0c5ea5821709\") " pod="openstack/neutron-db-create-g7rkz" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.854011 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-ml5sj"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.855398 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.861613 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-7wq4z" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.861903 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.867550 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-ml5sj"] Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.913260 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g7rkz" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.919973 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920022 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wrk\" (UniqueName: \"kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920072 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920090 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920123 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqhk\" (UniqueName: \"kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920151 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.920176 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.926897 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc kubenswrapper[4658]: I1002 11:35:54.926974 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:54 crc 
kubenswrapper[4658]: I1002 11:35:54.936121 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wrk\" (UniqueName: \"kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk\") pod \"keystone-db-sync-cgghm\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") " pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.021793 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.022152 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqhk\" (UniqueName: \"kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.022196 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.022236 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.027990 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.045960 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.046827 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.052484 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwqhk\" (UniqueName: \"kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk\") pod \"watcher-db-sync-ml5sj\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.102985 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cgghm" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.135874 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hdzkf"] Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.271475 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pd2cm"] Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.282462 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g7rkz"] Oct 02 11:35:55 crc kubenswrapper[4658]: W1002 11:35:55.311248 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod721f38ad_db77_4b36_aa92_0c5ea5821709.slice/crio-dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d WatchSource:0}: Error finding container dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d: Status 404 returned error can't find the container with id dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.312682 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.672336 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cgghm"] Oct 02 11:35:55 crc kubenswrapper[4658]: W1002 11:35:55.708191 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc08b85_172e_4a85_8c1a_dc6c713737fd.slice/crio-842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475 WatchSource:0}: Error finding container 842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475: Status 404 returned error can't find the container with id 842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475 Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.735629 4658 generic.go:334] "Generic (PLEG): container finished" podID="65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" containerID="afeb8dde19966c2f08a27ba588a4a3c8076e6262b0da3a788fee0aa1934f11f0" exitCode=0 Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.735778 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hdzkf" event={"ID":"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60","Type":"ContainerDied","Data":"afeb8dde19966c2f08a27ba588a4a3c8076e6262b0da3a788fee0aa1934f11f0"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.735834 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hdzkf" event={"ID":"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60","Type":"ContainerStarted","Data":"7a80d165c4cdafd37c12b6fd09044a0d660c565d3840324d24adf7f03240dede"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.738151 4658 generic.go:334] "Generic (PLEG): container finished" podID="721f38ad-db77-4b36-aa92-0c5ea5821709" containerID="c2e4fe080dad8831fce639647d70a7f9b9ece51b748726dcc1239e48a71df7c4" exitCode=0 Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.738198 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g7rkz" event={"ID":"721f38ad-db77-4b36-aa92-0c5ea5821709","Type":"ContainerDied","Data":"c2e4fe080dad8831fce639647d70a7f9b9ece51b748726dcc1239e48a71df7c4"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.738232 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g7rkz" 
event={"ID":"721f38ad-db77-4b36-aa92-0c5ea5821709","Type":"ContainerStarted","Data":"dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.752183 4658 generic.go:334] "Generic (PLEG): container finished" podID="8145034e-a3f8-413f-b306-26f4b240bd85" containerID="c955af8be2de8d584b337d64425bd2f2c67abe21965841e50a36c5782c1a4f14" exitCode=0 Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.752282 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" event={"ID":"8145034e-a3f8-413f-b306-26f4b240bd85","Type":"ContainerDied","Data":"c955af8be2de8d584b337d64425bd2f2c67abe21965841e50a36c5782c1a4f14"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.757234 4658 generic.go:334] "Generic (PLEG): container finished" podID="7ba10cce-80f9-474b-aece-681f238af730" containerID="e66c67cff56fbb8920a3618d65a9a75b4462f084982b05dc57a21f19ec9ed343" exitCode=0 Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.757359 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pd2cm" event={"ID":"7ba10cce-80f9-474b-aece-681f238af730","Type":"ContainerDied","Data":"e66c67cff56fbb8920a3618d65a9a75b4462f084982b05dc57a21f19ec9ed343"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.757734 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pd2cm" event={"ID":"7ba10cce-80f9-474b-aece-681f238af730","Type":"ContainerStarted","Data":"bec144b57304604b7dca5c892e50bd9f358206ecb77feee79de749972aed7019"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.759558 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgghm" event={"ID":"1bc08b85-172e-4a85-8c1a-dc6c713737fd","Type":"ContainerStarted","Data":"842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475"} Oct 02 11:35:55 crc kubenswrapper[4658]: I1002 11:35:55.922104 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-ml5sj"] Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.782986 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" event={"ID":"8145034e-a3f8-413f-b306-26f4b240bd85","Type":"ContainerStarted","Data":"410c6bd0ab23ea069efbd18fc4d6a500ef5ea2e6a88fb91ae89506b18ba894a8"} Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.783376 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.785417 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ml5sj" event={"ID":"efa1ebca-0cdd-4bce-adf2-e8273c3448f1","Type":"ContainerStarted","Data":"e151742b21a554e9b6ebe7aec369c327e35d35d87bb814e5635ffeb572d76313"} Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.791430 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerStarted","Data":"fc3b2f4dfe83e80fe704616899693e37590bc6228277dd0553652f4e696fb474"} Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.813472 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podStartSLOduration=4.813454173 podStartE2EDuration="4.813454173s" podCreationTimestamp="2025-10-02 11:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:35:56.804953962 +0000 UTC m=+1037.696107519" watchObservedRunningTime="2025-10-02 11:35:56.813454173 +0000 UTC m=+1037.704607740" Oct 02 11:35:56 crc kubenswrapper[4658]: I1002 11:35:56.835238 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.850487804 podStartE2EDuration="1m7.835217609s" podCreationTimestamp="2025-10-02 11:34:49 +0000 UTC" firstStartedPulling="2025-10-02 11:34:59.86490015 +0000 UTC m=+980.756053717" lastFinishedPulling="2025-10-02 11:35:55.849629955 +0000 UTC m=+1036.740783522" observedRunningTime="2025-10-02 11:35:56.833254317 +0000 UTC m=+1037.724407884" watchObservedRunningTime="2025-10-02 11:35:56.835217609 +0000 UTC m=+1037.726371176" Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.284131 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hdzkf" Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.392777 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsgcx\" (UniqueName: \"kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx\") pod \"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60\" (UID: \"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60\") " Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.409629 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx" (OuterVolumeSpecName: "kube-api-access-wsgcx") pod "65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" (UID: "65ba0196-c5f9-40ea-b43b-24c2d9e9ad60"). InnerVolumeSpecName "kube-api-access-wsgcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.494705 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsgcx\" (UniqueName: \"kubernetes.io/projected/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60-kube-api-access-wsgcx\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.817124 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hdzkf" event={"ID":"65ba0196-c5f9-40ea-b43b-24c2d9e9ad60","Type":"ContainerDied","Data":"7a80d165c4cdafd37c12b6fd09044a0d660c565d3840324d24adf7f03240dede"} Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.817162 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a80d165c4cdafd37c12b6fd09044a0d660c565d3840324d24adf7f03240dede" Oct 02 11:35:58 crc kubenswrapper[4658]: I1002 11:35:58.817165 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hdzkf" Oct 02 11:36:00 crc kubenswrapper[4658]: I1002 11:36:00.932163 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.867753 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pd2cm" event={"ID":"7ba10cce-80f9-474b-aece-681f238af730","Type":"ContainerDied","Data":"bec144b57304604b7dca5c892e50bd9f358206ecb77feee79de749972aed7019"} Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.868092 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec144b57304604b7dca5c892e50bd9f358206ecb77feee79de749972aed7019" Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.870537 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g7rkz" event={"ID":"721f38ad-db77-4b36-aa92-0c5ea5821709","Type":"ContainerDied","Data":"dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d"} Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.870591 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd752f483cf419004a7dc6b85ac34d1b2aadc86fadd7b749cd4c04231ab3599d" Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.889105 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pd2cm" Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.899982 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g7rkz" Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.989239 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4ph5\" (UniqueName: \"kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5\") pod \"7ba10cce-80f9-474b-aece-681f238af730\" (UID: \"7ba10cce-80f9-474b-aece-681f238af730\") " Oct 02 11:36:02 crc kubenswrapper[4658]: I1002 11:36:02.989333 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfth\" (UniqueName: \"kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth\") pod \"721f38ad-db77-4b36-aa92-0c5ea5821709\" (UID: \"721f38ad-db77-4b36-aa92-0c5ea5821709\") " Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.006602 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth" (OuterVolumeSpecName: "kube-api-access-rnfth") pod "721f38ad-db77-4b36-aa92-0c5ea5821709" (UID: "721f38ad-db77-4b36-aa92-0c5ea5821709"). InnerVolumeSpecName "kube-api-access-rnfth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.011765 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5" (OuterVolumeSpecName: "kube-api-access-h4ph5") pod "7ba10cce-80f9-474b-aece-681f238af730" (UID: "7ba10cce-80f9-474b-aece-681f238af730"). InnerVolumeSpecName "kube-api-access-h4ph5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.091394 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4ph5\" (UniqueName: \"kubernetes.io/projected/7ba10cce-80f9-474b-aece-681f238af730-kube-api-access-h4ph5\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.091438 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfth\" (UniqueName: \"kubernetes.io/projected/721f38ad-db77-4b36-aa92-0c5ea5821709-kube-api-access-rnfth\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.336042 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.440512 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kf47n"] Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.440806 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-kf47n" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns" containerID="cri-o://643f7791d269fb64976908c814c130765ee1b587b4f3901db7bc1580d007f634" gracePeriod=10 Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.898048 4658 generic.go:334] "Generic (PLEG): container finished" podID="abdb68cf-d07d-4e71-9489-236c44f58641" containerID="643f7791d269fb64976908c814c130765ee1b587b4f3901db7bc1580d007f634" exitCode=0 Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.898147 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pd2cm" Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.907711 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kf47n" event={"ID":"abdb68cf-d07d-4e71-9489-236c44f58641","Type":"ContainerDied","Data":"643f7791d269fb64976908c814c130765ee1b587b4f3901db7bc1580d007f634"} Oct 02 11:36:03 crc kubenswrapper[4658]: I1002 11:36:03.907834 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g7rkz" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.367263 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d0c3-account-create-kbt5q"] Oct 02 11:36:04 crc kubenswrapper[4658]: E1002 11:36:04.367706 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721f38ad-db77-4b36-aa92-0c5ea5821709" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.367732 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="721f38ad-db77-4b36-aa92-0c5ea5821709" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: E1002 11:36:04.367757 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.367767 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: E1002 11:36:04.367787 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba10cce-80f9-474b-aece-681f238af730" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.367795 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba10cce-80f9-474b-aece-681f238af730" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.368009 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba10cce-80f9-474b-aece-681f238af730" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.368029 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.368052 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="721f38ad-db77-4b36-aa92-0c5ea5821709" containerName="mariadb-database-create" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.368635 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d0c3-account-create-kbt5q" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.371348 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.376110 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d0c3-account-create-kbt5q"] Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.426442 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmq6\" (UniqueName: \"kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6\") pod \"cinder-d0c3-account-create-kbt5q\" (UID: \"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa\") " pod="openstack/cinder-d0c3-account-create-kbt5q" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.528057 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmq6\" (UniqueName: \"kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6\") pod \"cinder-d0c3-account-create-kbt5q\" (UID: \"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa\") " pod="openstack/cinder-d0c3-account-create-kbt5q" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.558244 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmq6\" (UniqueName: \"kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6\") pod \"cinder-d0c3-account-create-kbt5q\" (UID: \"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa\") " pod="openstack/cinder-d0c3-account-create-kbt5q" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.564077 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b069-account-create-mrfzg"] Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.565590 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.570327 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.583562 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b069-account-create-mrfzg"] Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.629583 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5s6\" (UniqueName: \"kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6\") pod \"barbican-b069-account-create-mrfzg\" (UID: \"27f10f35-b78b-4238-8d60-917300aaa9ad\") " pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.643771 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kf47n" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.694666 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d0c3-account-create-kbt5q" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.731703 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5s6\" (UniqueName: \"kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6\") pod \"barbican-b069-account-create-mrfzg\" (UID: \"27f10f35-b78b-4238-8d60-917300aaa9ad\") " pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.753124 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5s6\" (UniqueName: \"kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6\") pod \"barbican-b069-account-create-mrfzg\" (UID: \"27f10f35-b78b-4238-8d60-917300aaa9ad\") " pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:04 crc kubenswrapper[4658]: I1002 11:36:04.922382 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:05 crc kubenswrapper[4658]: I1002 11:36:05.932630 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:05 crc kubenswrapper[4658]: I1002 11:36:05.935134 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:06 crc kubenswrapper[4658]: I1002 11:36:06.926687 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.641954 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kf47n" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.697874 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.698102 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="config-reloader" containerID="cri-o://7850102b9d52f56cfad95e272c498d00dedf5b4db2dbe19bbab57ec4b2866c53" gracePeriod=600 Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.698190 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus" containerID="cri-o://fc3b2f4dfe83e80fe704616899693e37590bc6228277dd0553652f4e696fb474" gracePeriod=600 Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.698246 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="thanos-sidecar" containerID="cri-o://5d254e9aed8fbb8e98c0cb6fa78fefc38da6ca376af8b0d540609edbf4aa86ae" gracePeriod=600 Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.964710 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4544e55-087c-4095-be50-820df44e0a48" containerID="fc3b2f4dfe83e80fe704616899693e37590bc6228277dd0553652f4e696fb474" exitCode=0 Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.964981 4658 generic.go:334] "Generic 
(PLEG): container finished" podID="f4544e55-087c-4095-be50-820df44e0a48" containerID="5d254e9aed8fbb8e98c0cb6fa78fefc38da6ca376af8b0d540609edbf4aa86ae" exitCode=0 Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.964806 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerDied","Data":"fc3b2f4dfe83e80fe704616899693e37590bc6228277dd0553652f4e696fb474"} Oct 02 11:36:09 crc kubenswrapper[4658]: I1002 11:36:09.965023 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerDied","Data":"5d254e9aed8fbb8e98c0cb6fa78fefc38da6ca376af8b0d540609edbf4aa86ae"} Oct 02 11:36:10 crc kubenswrapper[4658]: I1002 11:36:10.932771 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused" Oct 02 11:36:10 crc kubenswrapper[4658]: I1002 11:36:10.977137 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4544e55-087c-4095-be50-820df44e0a48" containerID="7850102b9d52f56cfad95e272c498d00dedf5b4db2dbe19bbab57ec4b2866c53" exitCode=0 Oct 02 11:36:10 crc kubenswrapper[4658]: I1002 11:36:10.977188 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerDied","Data":"7850102b9d52f56cfad95e272c498d00dedf5b4db2dbe19bbab57ec4b2866c53"} Oct 02 11:36:11 crc kubenswrapper[4658]: E1002 11:36:11.718461 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Oct 02 11:36:11 crc kubenswrapper[4658]: E1002 11:36:11.718826 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4wrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-cgghm_openstack(1bc08b85-172e-4a85-8c1a-dc6c713737fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:11 crc kubenswrapper[4658]: E1002 11:36:11.720581 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/keystone-db-sync-cgghm" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" Oct 02 11:36:12 crc kubenswrapper[4658]: E1002 11:36:12.006686 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="openstack/keystone-db-sync-cgghm" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.641760 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kf47n" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.642461 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.689740 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-be39-account-create-j7lbd"] Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.691161 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.693610 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.710047 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-be39-account-create-j7lbd"] Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.795161 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlk6\" (UniqueName: \"kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6\") pod \"neutron-be39-account-create-j7lbd\" (UID: \"08a11803-3f64-4028-b71d-bab0c3e89ec3\") " pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.897468 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlk6\" (UniqueName: \"kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6\") pod \"neutron-be39-account-create-j7lbd\" (UID: \"08a11803-3f64-4028-b71d-bab0c3e89ec3\") " pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:14 crc kubenswrapper[4658]: I1002 11:36:14.919608 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlk6\" (UniqueName: \"kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6\") pod \"neutron-be39-account-create-j7lbd\" (UID: \"08a11803-3f64-4028-b71d-bab0c3e89ec3\") " pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:15 crc kubenswrapper[4658]: I1002 11:36:15.012895 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:15 crc kubenswrapper[4658]: I1002 11:36:15.932216 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.052864 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.053390 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqzz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-kbvvc_openstack(87a291e0-0291-4591-8d80-818338d6ae2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.054562 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-kbvvc" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.443572 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.134:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.443864 4658 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.134:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.443996 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.129.56.134:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwqhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-ml5sj_openstack(efa1ebca-0cdd-4bce-adf2-e8273c3448f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:18 crc kubenswrapper[4658]: E1002 11:36:18.445709 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-ml5sj" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.843985 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.851206 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.966482 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d0c3-account-create-kbt5q"] Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.977829 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.977933 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6mbx\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.977983 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978091 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978121 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb\") pod \"abdb68cf-d07d-4e71-9489-236c44f58641\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978145 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config\") pod \"abdb68cf-d07d-4e71-9489-236c44f58641\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978190 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978218 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4g7c\" (UniqueName: \"kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c\") pod \"abdb68cf-d07d-4e71-9489-236c44f58641\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978312 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc\") pod \"abdb68cf-d07d-4e71-9489-236c44f58641\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978333 4658 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978366 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb\") pod \"abdb68cf-d07d-4e71-9489-236c44f58641\" (UID: \"abdb68cf-d07d-4e71-9489-236c44f58641\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978652 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.978691 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0\") pod \"f4544e55-087c-4095-be50-820df44e0a48\" (UID: \"f4544e55-087c-4095-be50-820df44e0a48\") " Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.979805 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.985947 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b069-account-create-mrfzg"] Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.986247 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c" (OuterVolumeSpecName: "kube-api-access-m4g7c") pod "abdb68cf-d07d-4e71-9489-236c44f58641" (UID: "abdb68cf-d07d-4e71-9489-236c44f58641"). InnerVolumeSpecName "kube-api-access-m4g7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.988807 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.989433 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config" (OuterVolumeSpecName: "config") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.990272 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out" (OuterVolumeSpecName: "config-out") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:36:18 crc kubenswrapper[4658]: I1002 11:36:18.992828 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx" (OuterVolumeSpecName: "kube-api-access-h6mbx") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "kube-api-access-h6mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.021225 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-be39-account-create-j7lbd"] Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.025379 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "pvc-d812d300-651e-49c4-ad99-6713da3d5cbd". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.025525 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.078652 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config" (OuterVolumeSpecName: "web-config") pod "f4544e55-087c-4095-be50-820df44e0a48" (UID: "f4544e55-087c-4095-be50-820df44e0a48"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080805 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080858 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") on node \"crc\" " Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080876 4658 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4544e55-087c-4095-be50-820df44e0a48-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080889 4658 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080902 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6mbx\" (UniqueName: \"kubernetes.io/projected/f4544e55-087c-4095-be50-820df44e0a48-kube-api-access-h6mbx\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080913 4658 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080923 4658 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4544e55-087c-4095-be50-820df44e0a48-config-out\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080936 4658 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4544e55-087c-4095-be50-820df44e0a48-web-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.080946 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4g7c\" (UniqueName: \"kubernetes.io/projected/abdb68cf-d07d-4e71-9489-236c44f58641-kube-api-access-m4g7c\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.081201 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abdb68cf-d07d-4e71-9489-236c44f58641" (UID: "abdb68cf-d07d-4e71-9489-236c44f58641"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.087890 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config" (OuterVolumeSpecName: "config") pod "abdb68cf-d07d-4e71-9489-236c44f58641" (UID: "abdb68cf-d07d-4e71-9489-236c44f58641"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.092531 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abdb68cf-d07d-4e71-9489-236c44f58641" (UID: "abdb68cf-d07d-4e71-9489-236c44f58641"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.099271 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kf47n" event={"ID":"abdb68cf-d07d-4e71-9489-236c44f58641","Type":"ContainerDied","Data":"ab4586289fb9b0ece49b57be10f63f4276688d10a12a58375cdc6baf24eba721"} Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.099359 4658 scope.go:117] "RemoveContainer" containerID="643f7791d269fb64976908c814c130765ee1b587b4f3901db7bc1580d007f634" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.099582 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kf47n" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.101310 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d0c3-account-create-kbt5q" event={"ID":"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa","Type":"ContainerStarted","Data":"7dfaa954dda71399dd49ca9ead43b7b809daac746d46f16aec134da6679871a1"} Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.104929 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abdb68cf-d07d-4e71-9489-236c44f58641" (UID: "abdb68cf-d07d-4e71-9489-236c44f58641"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.107907 4658 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.108056 4658 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d812d300-651e-49c4-ad99-6713da3d5cbd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd") on node "crc" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.109075 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4544e55-087c-4095-be50-820df44e0a48","Type":"ContainerDied","Data":"45ce70dfa87bf45be43bea17eccb78fc18bde1c7dd85e0383b8de78b862643e2"} Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.109206 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.112001 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be39-account-create-j7lbd" event={"ID":"08a11803-3f64-4028-b71d-bab0c3e89ec3","Type":"ContainerStarted","Data":"dda6bdc91b6c5e36435c4f626d0f63d30ed0a26e03afea4ea6a3070a0857019f"} Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.113450 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b069-account-create-mrfzg" event={"ID":"27f10f35-b78b-4238-8d60-917300aaa9ad","Type":"ContainerStarted","Data":"0f0b4534918826aafffc42b0515c21662ee4c51880a4ae7cbc3258dbc110c072"} Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.115563 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-kbvvc" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.115653 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.134:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-ml5sj" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.144493 4658 scope.go:117] "RemoveContainer" containerID="7ca5fb5c5ef318f07b1c5526af2b9ad68ed85d4da8d94a8f89735b2cb09bc1d5" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.190028 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.190069 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.190083 4658 reconciler_common.go:293] "Volume detached for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.190095 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.190109 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdb68cf-d07d-4e71-9489-236c44f58641-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.228228 4658 scope.go:117] "RemoveContainer" containerID="fc3b2f4dfe83e80fe704616899693e37590bc6228277dd0553652f4e696fb474" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.248750 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.264976 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 11:36:19 crc 
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281127 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.281440 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns"
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281479 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="init-config-reloader"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.281486 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="init-config-reloader"
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281496 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="thanos-sidecar"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.281504 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="thanos-sidecar"
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281514 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="config-reloader"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.281520 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="config-reloader"
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281529 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="init"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.281535 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="init"
Oct 02 11:36:19 crc kubenswrapper[4658]: E1002 11:36:19.281607 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.282427 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.282613 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="thanos-sidecar"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.282631 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" containerName="dnsmasq-dns"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.282646 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="prometheus"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.282664 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4544e55-087c-4095-be50-820df44e0a48" containerName="config-reloader"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.284825 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.288826 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.289355 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5lk2c"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.289478 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.289599 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.289967 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.290172 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.295483 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.297896 4658 scope.go:117] "RemoveContainer" containerID="5d254e9aed8fbb8e98c0cb6fa78fefc38da6ca376af8b0d540609edbf4aa86ae"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.300142 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.338453 4658 scope.go:117] "RemoveContainer" containerID="7850102b9d52f56cfad95e272c498d00dedf5b4db2dbe19bbab57ec4b2866c53"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.359038 4658 scope.go:117] "RemoveContainer" containerID="b1679924fa14ef08b2595b7568d88e7f15b09384631dbf3c1591288012ee5b6d"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.396989 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397035 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397071 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397088 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
\"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397118 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs92g\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397184 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397216 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397273 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397305 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397411 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.397447 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.498883 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs92g\" (UniqueName: 
\"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499030 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499083 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499136 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499165 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499194 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499234 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499260 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499287 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499349 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.499376 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.500951 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.504858 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.505085 4658 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.505113 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b2727623f8bbe474018a880a77329ded2fae90762c86c59a9726b562d3cbf13f/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.506801 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.507026 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.507379 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.510835 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.511340 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.511589 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.511602 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.516710 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs92g\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.540882 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.610590 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.691147 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kf47n"]
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.699218 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kf47n"]
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.961496 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdb68cf-d07d-4e71-9489-236c44f58641" path="/var/lib/kubelet/pods/abdb68cf-d07d-4e71-9489-236c44f58641/volumes"
Oct 02 11:36:19 crc kubenswrapper[4658]: I1002 11:36:19.962952 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4544e55-087c-4095-be50-820df44e0a48" path="/var/lib/kubelet/pods/f4544e55-087c-4095-be50-820df44e0a48/volumes"
Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.064884 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 11:36:20 crc kubenswrapper[4658]: W1002 11:36:20.069750 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b3416a_79b7_450d_a7aa_42c1747c5c55.slice/crio-594ac99825969811f912c326bec4d19b5d12b4d2a44d82bbcb00083665d5a981 WatchSource:0}: Error finding container 594ac99825969811f912c326bec4d19b5d12b4d2a44d82bbcb00083665d5a981: Status 404 returned error can't find the container with id 594ac99825969811f912c326bec4d19b5d12b4d2a44d82bbcb00083665d5a981
Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.134715 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerStarted","Data":"594ac99825969811f912c326bec4d19b5d12b4d2a44d82bbcb00083665d5a981"}
Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.137201 4658 generic.go:334] "Generic (PLEG): container finished" podID="3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" containerID="654464221e8c58a97a0ab93c59e1723d53118315d2111f02fa611eb5b393d6e2" exitCode=0
Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.137255 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d0c3-account-create-kbt5q" event={"ID":"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa","Type":"ContainerDied","Data":"654464221e8c58a97a0ab93c59e1723d53118315d2111f02fa611eb5b393d6e2"}
Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.141320 4658 generic.go:334] "Generic (PLEG): container finished" podID="08a11803-3f64-4028-b71d-bab0c3e89ec3" containerID="b239ce3207f783423497337b429199e1b0dbb924305ae106452a2eb4f3cec4b9" exitCode=0
containerID="b239ce3207f783423497337b429199e1b0dbb924305ae106452a2eb4f3cec4b9" exitCode=0 Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.141370 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be39-account-create-j7lbd" event={"ID":"08a11803-3f64-4028-b71d-bab0c3e89ec3","Type":"ContainerDied","Data":"b239ce3207f783423497337b429199e1b0dbb924305ae106452a2eb4f3cec4b9"} Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.143352 4658 generic.go:334] "Generic (PLEG): container finished" podID="27f10f35-b78b-4238-8d60-917300aaa9ad" containerID="bb90bf4685877447040efedd5233a6b5c92c4c0acda94e1a5dd0473a65be42f7" exitCode=0 Oct 02 11:36:20 crc kubenswrapper[4658]: I1002 11:36:20.143382 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b069-account-create-mrfzg" event={"ID":"27f10f35-b78b-4238-8d60-917300aaa9ad","Type":"ContainerDied","Data":"bb90bf4685877447040efedd5233a6b5c92c4c0acda94e1a5dd0473a65be42f7"} Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.593615 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b069-account-create-mrfzg" Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.636032 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj5s6\" (UniqueName: \"kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6\") pod \"27f10f35-b78b-4238-8d60-917300aaa9ad\" (UID: \"27f10f35-b78b-4238-8d60-917300aaa9ad\") " Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.658500 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6" (OuterVolumeSpecName: "kube-api-access-bj5s6") pod "27f10f35-b78b-4238-8d60-917300aaa9ad" (UID: "27f10f35-b78b-4238-8d60-917300aaa9ad"). InnerVolumeSpecName "kube-api-access-bj5s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.717526 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be39-account-create-j7lbd" Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.717998 4658 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.741496 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj5s6\" (UniqueName: \"kubernetes.io/projected/27f10f35-b78b-4238-8d60-917300aaa9ad-kube-api-access-bj5s6\") on node \"crc\" DevicePath \"\""
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.842215 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlk6\" (UniqueName: \"kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6\") pod \"08a11803-3f64-4028-b71d-bab0c3e89ec3\" (UID: \"08a11803-3f64-4028-b71d-bab0c3e89ec3\") "
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.842265 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjmq6\" (UniqueName: \"kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6\") pod \"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa\" (UID: \"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa\") "
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.851849 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6" (OuterVolumeSpecName: "kube-api-access-xnlk6") pod "08a11803-3f64-4028-b71d-bab0c3e89ec3" (UID: "08a11803-3f64-4028-b71d-bab0c3e89ec3"). InnerVolumeSpecName "kube-api-access-xnlk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.853805 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6" (OuterVolumeSpecName: "kube-api-access-pjmq6") pod "3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" (UID: "3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa"). InnerVolumeSpecName "kube-api-access-pjmq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.944330 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlk6\" (UniqueName: \"kubernetes.io/projected/08a11803-3f64-4028-b71d-bab0c3e89ec3-kube-api-access-xnlk6\") on node \"crc\" DevicePath \"\""
Oct 02 11:36:21 crc kubenswrapper[4658]: I1002 11:36:21.944601 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjmq6\" (UniqueName: \"kubernetes.io/projected/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa-kube-api-access-pjmq6\") on node \"crc\" DevicePath \"\""
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.161746 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be39-account-create-j7lbd" event={"ID":"08a11803-3f64-4028-b71d-bab0c3e89ec3","Type":"ContainerDied","Data":"dda6bdc91b6c5e36435c4f626d0f63d30ed0a26e03afea4ea6a3070a0857019f"}
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.161794 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda6bdc91b6c5e36435c4f626d0f63d30ed0a26e03afea4ea6a3070a0857019f"
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.161854 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be39-account-create-j7lbd"
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.164575 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b069-account-create-mrfzg" event={"ID":"27f10f35-b78b-4238-8d60-917300aaa9ad","Type":"ContainerDied","Data":"0f0b4534918826aafffc42b0515c21662ee4c51880a4ae7cbc3258dbc110c072"}
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.164609 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0b4534918826aafffc42b0515c21662ee4c51880a4ae7cbc3258dbc110c072"
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.164658 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b069-account-create-mrfzg"
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.166958 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d0c3-account-create-kbt5q" event={"ID":"3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa","Type":"ContainerDied","Data":"7dfaa954dda71399dd49ca9ead43b7b809daac746d46f16aec134da6679871a1"}
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.166982 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfaa954dda71399dd49ca9ead43b7b809daac746d46f16aec134da6679871a1"
Oct 02 11:36:22 crc kubenswrapper[4658]: I1002 11:36:22.167048 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d0c3-account-create-kbt5q"
Oct 02 11:36:23 crc kubenswrapper[4658]: I1002 11:36:23.176474 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerStarted","Data":"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde"}
Oct 02 11:36:24 crc kubenswrapper[4658]: I1002 11:36:24.186935 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgghm" event={"ID":"1bc08b85-172e-4a85-8c1a-dc6c713737fd","Type":"ContainerStarted","Data":"5d729cb5a7c5d82ac7670551e6f885beb68cc768a5939d3bd7e8d544a555ecaa"}
Oct 02 11:36:27 crc kubenswrapper[4658]: I1002 11:36:27.220185 4658 generic.go:334] "Generic (PLEG): container finished" podID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" containerID="5d729cb5a7c5d82ac7670551e6f885beb68cc768a5939d3bd7e8d544a555ecaa" exitCode=0
Oct 02 11:36:27 crc kubenswrapper[4658]: I1002 11:36:27.220278 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgghm" event={"ID":"1bc08b85-172e-4a85-8c1a-dc6c713737fd","Type":"ContainerDied","Data":"5d729cb5a7c5d82ac7670551e6f885beb68cc768a5939d3bd7e8d544a555ecaa"}
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.575264 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cgghm"
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.661290 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data\") pod \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") "
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.661508 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4wrk\" (UniqueName: \"kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk\") pod \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") "
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.661581 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle\") pod \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\" (UID: \"1bc08b85-172e-4a85-8c1a-dc6c713737fd\") "
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.667276 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk" (OuterVolumeSpecName: "kube-api-access-m4wrk") pod "1bc08b85-172e-4a85-8c1a-dc6c713737fd" (UID: "1bc08b85-172e-4a85-8c1a-dc6c713737fd"). InnerVolumeSpecName "kube-api-access-m4wrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.687354 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bc08b85-172e-4a85-8c1a-dc6c713737fd" (UID: "1bc08b85-172e-4a85-8c1a-dc6c713737fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.708553 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data" (OuterVolumeSpecName: "config-data") pod "1bc08b85-172e-4a85-8c1a-dc6c713737fd" (UID: "1bc08b85-172e-4a85-8c1a-dc6c713737fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.763959 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4wrk\" (UniqueName: \"kubernetes.io/projected/1bc08b85-172e-4a85-8c1a-dc6c713737fd-kube-api-access-m4wrk\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.764017 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:28 crc kubenswrapper[4658]: I1002 11:36:28.764030 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc08b85-172e-4a85-8c1a-dc6c713737fd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.236990 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgghm" event={"ID":"1bc08b85-172e-4a85-8c1a-dc6c713737fd","Type":"ContainerDied","Data":"842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475"} Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.237317 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842129441d60886525e432165eabdadc9921a4b45e495402c9cc7d46d7870475" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.237007 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cgghm" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.238473 4658 generic.go:334] "Generic (PLEG): container finished" podID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" exitCode=0 Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.238498 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerDied","Data":"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde"} Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505015 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"] Oct 02 11:36:29 crc kubenswrapper[4658]: E1002 11:36:29.505622 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" containerName="keystone-db-sync" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505640 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" containerName="keystone-db-sync" Oct 02 11:36:29 crc kubenswrapper[4658]: E1002 11:36:29.505654 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f10f35-b78b-4238-8d60-917300aaa9ad" containerName="mariadb-account-create" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505663 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f10f35-b78b-4238-8d60-917300aaa9ad" containerName="mariadb-account-create" Oct 02 11:36:29 crc kubenswrapper[4658]: E1002 11:36:29.505689 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" containerName="mariadb-account-create" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505696 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" containerName="mariadb-account-create" Oct 02 11:36:29 crc 
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505718 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a11803-3f64-4028-b71d-bab0c3e89ec3" containerName="mariadb-account-create"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505889 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a11803-3f64-4028-b71d-bab0c3e89ec3" containerName="mariadb-account-create"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505903 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" containerName="mariadb-account-create"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505920 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" containerName="keystone-db-sync"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.505929 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f10f35-b78b-4238-8d60-917300aaa9ad" containerName="mariadb-account-create"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.517796 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.519641 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.586413 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sqr7t"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.587874 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sqr7t"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.602272 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.602507 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.602657 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bltpc"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.602832 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.609769 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.609819 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4fp\" (UniqueName: \"kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.609843 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.609980 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.610002 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.610023 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.612799 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sqr7t"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712760 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
\"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712806 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712852 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712872 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712946 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712982 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.712999 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.713017 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.713073 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.713107 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46ph5\" (UniqueName: 
\"kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.713143 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.713169 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx4fp\" (UniqueName: \"kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.714170 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.714677 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.715400 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.716270 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.716584 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.759185 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s6w77"] Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.760397 4658 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.763842 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.765094 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2d764"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.770438 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.782985 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78b685455c-5zn4s"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.783490 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4fp\" (UniqueName: \"kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp\") pod \"dnsmasq-dns-55fff446b9-57zdb\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " pod="openstack/dnsmasq-dns-55fff446b9-57zdb"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.784347 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78b685455c-5zn4s"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.793558 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s6w77"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.798839 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.799066 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.799231 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jn7w6"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.799389 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.807791 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78b685455c-5zn4s"]
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816767 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816803 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816832 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t"
Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816851 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t"
(UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816868 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816914 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf8r5\" (UniqueName: \"kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816933 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816954 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.816970 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.817007 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.817024 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.817049 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46ph5\" (UniqueName: \"kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.822049 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.830473 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.839058 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.839781 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.844882 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.862532 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"] Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.863148 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.863942 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46ph5\" (UniqueName: \"kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5\") pod \"keystone-bootstrap-sqr7t\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.910431 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-d5ppn"] Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.911891 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.920779 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d5ppn"] Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.920906 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.921116 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mwbdb" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922397 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922464 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922541 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922595 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922622 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5rt\" (UniqueName: \"kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922700 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922796 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922836 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf8r5\" (UniqueName: \"kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5\") pod 
\"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922881 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922906 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.922939 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.925021 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.931109 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.932478 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.934006 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.938583 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.941556 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.946788 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:29 crc kubenswrapper[4658]: I1002 11:36:29.984103 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.005533 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf8r5\" (UniqueName: \"kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5\") pod \"cinder-db-sync-s6w77\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025654 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025715 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025797 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025830 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025855 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025890 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025956 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522tx\" (UniqueName: \"kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.025984 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " 
pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026004 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026061 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5rt\" (UniqueName: \"kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026105 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026130 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dfv\" (UniqueName: \"kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026162 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026184 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.026190 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.027689 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.028003 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 
11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.033516 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.033559 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jftqc"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.034912 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9hqkv"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.035905 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jftqc"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.036016 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.036449 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.040649 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.040974 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4wj8" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.042913 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.042935 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.045457 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4r28x" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.045714 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.046820 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.049792 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.050344 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.058635 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9hqkv"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.079170 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.081031 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.081277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5rt\" (UniqueName: \"kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt\") pod \"horizon-78b685455c-5zn4s\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.084264 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.084446 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.084683 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.096884 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.142939 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143037 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522tx\" (UniqueName: \"kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143107 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143142 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143172 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143237 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143609 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnlr\" (UniqueName: \"kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143645 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143679 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dfv\" (UniqueName: \"kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143727 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47s9v\" (UniqueName: \"kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143782 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143831 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143893 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.143978 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144116 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144271 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96sl\" (UniqueName: \"kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144343 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144386 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144421 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144526 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144610 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.144649 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.146692 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.147263 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.161561 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.162271 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.169484 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.173790 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.177778 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522tx\" (UniqueName: \"kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.183465 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data\") pod \"barbican-db-sync-d5ppn\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.192098 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dfv\" (UniqueName: \"kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv\") pod \"dnsmasq-dns-76fcf4b695-t2lfk\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247559 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247615 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9cx\" (UniqueName: \"kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247642 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k96sl\" (UniqueName: \"kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl\") pod \"neutron-db-sync-jftqc\" 
(UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247664 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247690 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247710 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247734 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247750 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247767 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247787 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247807 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247824 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247845 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247873 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247894 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnlr\" (UniqueName: \"kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247913 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47s9v\" (UniqueName: \"kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247935 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247952 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.247980 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.248001 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.248826 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.249125 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data\") pod \"horizon-77b58fd99f-hfqr9\" (UID: 
\"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.250279 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.252818 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.253540 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.254148 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.254252 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s6w77" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.263419 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.265985 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.269809 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.285232 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.286445 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96sl\" (UniqueName: \"kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl\") pod \"neutron-db-sync-jftqc\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 
crc kubenswrapper[4658]: I1002 11:36:30.288939 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.294198 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnlr\" (UniqueName: \"kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr\") pod \"horizon-77b58fd99f-hfqr9\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.294199 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47s9v\" (UniqueName: \"kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v\") pod \"placement-db-sync-9hqkv\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.297768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerStarted","Data":"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1"} Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.353885 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354603 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354665 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354714 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354741 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9cx\" (UniqueName: \"kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354796 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354840 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " 
pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.354926 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.356629 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.358154 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.360949 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.364861 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.369924 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.375018 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.379708 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.380124 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9cx\" (UniqueName: \"kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx\") pod \"ceilometer-0\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.404042 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9hqkv" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.454567 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jftqc" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.509242 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.524389 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.554587 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"] Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.689312 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sqr7t"] Oct 02 11:36:30 crc kubenswrapper[4658]: W1002 11:36:30.709621 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8797d11d_3098_476c_9273_b62dc97e1558.slice/crio-76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0 WatchSource:0}: Error finding container 76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0: Status 404 returned error can't find the container with id 76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0 Oct 02 11:36:30 crc kubenswrapper[4658]: I1002 11:36:30.831917 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s6w77"] Oct 02 11:36:31 crc kubenswrapper[4658]: W1002 11:36:31.096977 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6378c687_5c50_4efd_8cc5_b7aa4ef82297.slice/crio-75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0 WatchSource:0}: Error finding container 75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0: Status 404 returned error can't find the container with id 75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0 Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.191775 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d5ppn"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.213272 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jftqc"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.228247 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78b685455c-5zn4s"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.309720 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" event={"ID":"e8653761-102b-4879-ba09-1b263a960052","Type":"ContainerStarted","Data":"904c0427b5e2fa0a4953c62684fedbd26075cebdeea7362e7f8fa56d8c797873"} Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.309793 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" event={"ID":"e8653761-102b-4879-ba09-1b263a960052","Type":"ContainerStarted","Data":"fbe257467de1498a68120ea77103feef118561622f9fd82418221207cbafcf7b"} Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.309815 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" podUID="e8653761-102b-4879-ba09-1b263a960052" containerName="init" containerID="cri-o://904c0427b5e2fa0a4953c62684fedbd26075cebdeea7362e7f8fa56d8c797873" gracePeriod=10 Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.311791 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqr7t" 
event={"ID":"8797d11d-3098-476c-9273-b62dc97e1558","Type":"ContainerStarted","Data":"76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0"} Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.319011 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6w77" event={"ID":"6378c687-5c50-4efd-8cc5-b7aa4ef82297","Type":"ContainerStarted","Data":"75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0"} Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.714680 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.782221 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.813649 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9hqkv"] Oct 02 11:36:31 crc kubenswrapper[4658]: I1002 11:36:31.823864 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:36:32 crc kubenswrapper[4658]: W1002 11:36:32.160633 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6380c8f_2c75_46e1_b055_80c6f4ecdde5.slice/crio-607c7e9aca50e975a19e64f25fd8e4bebf9f3c1557404b65595f2b414a59f3a3 WatchSource:0}: Error finding container 607c7e9aca50e975a19e64f25fd8e4bebf9f3c1557404b65595f2b414a59f3a3: Status 404 returned error can't find the container with id 607c7e9aca50e975a19e64f25fd8e4bebf9f3c1557404b65595f2b414a59f3a3 Oct 02 11:36:32 crc kubenswrapper[4658]: W1002 11:36:32.163951 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5709aa_c4aa_4577_b3cb_e518acf890f1.slice/crio-ed563ed6debbdbe889ef3cf32b42e9f82aa88c2dae4b5936a87d2ec4f4c0c628 WatchSource:0}: Error finding container ed563ed6debbdbe889ef3cf32b42e9f82aa88c2dae4b5936a87d2ec4f4c0c628: Status 404 returned error can't find the container with id ed563ed6debbdbe889ef3cf32b42e9f82aa88c2dae4b5936a87d2ec4f4c0c628 Oct 02 11:36:32 crc kubenswrapper[4658]: W1002 11:36:32.175822 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4602160_442e_4a87_bacb_3493da6f4dad.slice/crio-fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38 WatchSource:0}: Error finding container fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38: Status 404 returned error can't find the container with id fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38 Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.321872 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.356677 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerStarted","Data":"385e17e93764f1535f16a15115d98f8494f9e38c1c0acf4eca4d8efe47dd3631"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.361338 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.383796 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hqkv" 
event={"ID":"a4602160-442e-4a87-bacb-3493da6f4dad","Type":"ContainerStarted","Data":"fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.407215 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d5ppn" event={"ID":"057d8045-79f8-4f4d-9b29-ce1f517e0f94","Type":"ContainerStarted","Data":"e526562e6c0492349e4a75d7da2b7d1bb6f24b1d65810a000abb43a147cf61f3"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.408833 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.423491 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.471528 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.472391 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerStarted","Data":"32607097163d44e1a6a5791b9a057abeb63dff4d2bf61a4850759596934a5f03"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.477096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b58fd99f-hfqr9" event={"ID":"d6380c8f-2c75-46e1-b055-80c6f4ecdde5","Type":"ContainerStarted","Data":"607c7e9aca50e975a19e64f25fd8e4bebf9f3c1557404b65595f2b414a59f3a3"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.489895 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.509966 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.510058 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.512058 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx4fp\" (UniqueName: \"kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.512127 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.512763 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " 
Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.512866 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc\") pod \"e8653761-102b-4879-ba09-1b263a960052\" (UID: \"e8653761-102b-4879-ba09-1b263a960052\") " Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.514744 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slw8v\" (UniqueName: \"kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.515338 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.515537 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.515600 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.515647 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.519027 4658 generic.go:334] "Generic (PLEG): container finished" podID="e8653761-102b-4879-ba09-1b263a960052" containerID="904c0427b5e2fa0a4953c62684fedbd26075cebdeea7362e7f8fa56d8c797873" exitCode=0 Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.519137 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" event={"ID":"e8653761-102b-4879-ba09-1b263a960052","Type":"ContainerDied","Data":"904c0427b5e2fa0a4953c62684fedbd26075cebdeea7362e7f8fa56d8c797873"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.519171 4658 scope.go:117] "RemoveContainer" containerID="904c0427b5e2fa0a4953c62684fedbd26075cebdeea7362e7f8fa56d8c797873" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.519334 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-57zdb" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.539792 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp" (OuterVolumeSpecName: "kube-api-access-hx4fp") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "kube-api-access-hx4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.542371 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqr7t" event={"ID":"8797d11d-3098-476c-9273-b62dc97e1558","Type":"ContainerStarted","Data":"8779db10fbef68937802a29919f9f9dd2cbddf39104c74c0fdd50ee885a5ca23"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.549098 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.556266 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jftqc" event={"ID":"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab","Type":"ContainerStarted","Data":"4a81197aa9c95a844236de94b8cdef32aca5f00b8f300721eff81d37ae62d63a"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.559202 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerStarted","Data":"ed563ed6debbdbe889ef3cf32b42e9f82aa88c2dae4b5936a87d2ec4f4c0c628"} Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.578508 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.597491 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sqr7t" podStartSLOduration=3.597447994 podStartE2EDuration="3.597447994s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:36:32.57857364 +0000 UTC m=+1073.469727207" watchObservedRunningTime="2025-10-02 11:36:32.597447994 +0000 UTC m=+1073.488601561" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.618095 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slw8v\" (UniqueName: \"kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.619220 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.619402 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.619494 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.619860 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.620761 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.624708 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.625399 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.625494 4658 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.625698 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx4fp\" (UniqueName: \"kubernetes.io/projected/e8653761-102b-4879-ba09-1b263a960052-kube-api-access-hx4fp\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.625599 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.627077 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.627762 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.652128 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config" (OuterVolumeSpecName: "config") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.652835 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slw8v\" (UniqueName: \"kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v\") pod \"horizon-6f5d4896d9-72pgt\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.727378 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.727409 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.853991 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8653761-102b-4879-ba09-1b263a960052" (UID: "e8653761-102b-4879-ba09-1b263a960052"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:36:32 crc kubenswrapper[4658]: I1002 11:36:32.937512 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8653761-102b-4879-ba09-1b263a960052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.011345 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.205479 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"] Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.213952 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-57zdb"] Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.561540 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.588862 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jftqc" event={"ID":"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab","Type":"ContainerStarted","Data":"806bda466dc507d954b7c5a64bd6585e9b0914e91020581cda090db9bef02e16"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.591362 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerStarted","Data":"5795a2ce92c833a010aaac209d7283d07ce29363ddab019311b5845fda1ba01a"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.601601 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ml5sj" event={"ID":"efa1ebca-0cdd-4bce-adf2-e8273c3448f1","Type":"ContainerStarted","Data":"29374d3027afadc1219abc9cf8b80cf6e2cc79d695109e13531f4660ad1f4722"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.605861 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerStarted","Data":"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.605901 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerStarted","Data":"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.609941 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jftqc" podStartSLOduration=4.609903387 podStartE2EDuration="4.609903387s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:36:33.608351277 +0000 UTC m=+1074.499504854" watchObservedRunningTime="2025-10-02 11:36:33.609903387 +0000 UTC m=+1074.501056954" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.626065 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-ml5sj" podStartSLOduration=4.052964829 podStartE2EDuration="39.626047704s" podCreationTimestamp="2025-10-02 11:35:54 +0000 UTC" firstStartedPulling="2025-10-02 11:35:55.927518246 +0000 UTC m=+1036.818671813" lastFinishedPulling="2025-10-02 11:36:31.500601121 +0000 UTC 
m=+1072.391754688" observedRunningTime="2025-10-02 11:36:33.624016188 +0000 UTC m=+1074.515169745" watchObservedRunningTime="2025-10-02 11:36:33.626047704 +0000 UTC m=+1074.517201271" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.628318 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbvvc" event={"ID":"87a291e0-0291-4591-8d80-818338d6ae2d","Type":"ContainerStarted","Data":"b815b56b43296362dc4f3470f3d7e8ef1d65ff3d6f6ed7a1580287738ab2e409"} Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.676962 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.676935881 podStartE2EDuration="14.676935881s" podCreationTimestamp="2025-10-02 11:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:36:33.65376461 +0000 UTC m=+1074.544918177" watchObservedRunningTime="2025-10-02 11:36:33.676935881 +0000 UTC m=+1074.568089468" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.686408 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kbvvc" podStartSLOduration=4.617303321 podStartE2EDuration="41.686384633s" podCreationTimestamp="2025-10-02 11:35:52 +0000 UTC" firstStartedPulling="2025-10-02 11:35:54.055078098 +0000 UTC m=+1034.946231655" lastFinishedPulling="2025-10-02 11:36:31.1241594 +0000 UTC m=+1072.015312967" observedRunningTime="2025-10-02 11:36:33.671027032 +0000 UTC m=+1074.562180599" watchObservedRunningTime="2025-10-02 11:36:33.686384633 +0000 UTC m=+1074.577538200" Oct 02 11:36:33 crc kubenswrapper[4658]: I1002 11:36:33.967711 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8653761-102b-4879-ba09-1b263a960052" path="/var/lib/kubelet/pods/e8653761-102b-4879-ba09-1b263a960052/volumes" Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.611544 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.612670 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.618455 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.647423 4658 generic.go:334] "Generic (PLEG): container finished" podID="f667e839-3159-487f-af95-60818fdc1b84" containerID="5795a2ce92c833a010aaac209d7283d07ce29363ddab019311b5845fda1ba01a" exitCode=0 Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.647611 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerDied","Data":"5795a2ce92c833a010aaac209d7283d07ce29363ddab019311b5845fda1ba01a"} Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.680332 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5d4896d9-72pgt" event={"ID":"e4842274-d590-4208-8a18-892db2b9e824","Type":"ContainerStarted","Data":"cdb8052acba5879ff1e2fc1b08b7a0d9e441db12da6782c7778ee56d87772d08"} Oct 02 11:36:34 crc kubenswrapper[4658]: I1002 11:36:34.732641 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 02 11:36:35 crc 
kubenswrapper[4658]: I1002 11:36:35.743414 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerStarted","Data":"1f75b96569cb9daf5e807e6a5ae8e9e362f122f5c378832e24dbe4139b2b82a2"} Oct 02 11:36:35 crc kubenswrapper[4658]: I1002 11:36:35.743845 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:35 crc kubenswrapper[4658]: I1002 11:36:35.772980 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" podStartSLOduration=6.772961373 podStartE2EDuration="6.772961373s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:36:35.764833093 +0000 UTC m=+1076.655986660" watchObservedRunningTime="2025-10-02 11:36:35.772961373 +0000 UTC m=+1076.664114940" Oct 02 11:36:37 crc kubenswrapper[4658]: I1002 11:36:37.781017 4658 generic.go:334] "Generic (PLEG): container finished" podID="8797d11d-3098-476c-9273-b62dc97e1558" containerID="8779db10fbef68937802a29919f9f9dd2cbddf39104c74c0fdd50ee885a5ca23" exitCode=0 Oct 02 11:36:37 crc kubenswrapper[4658]: I1002 11:36:37.781620 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqr7t" event={"ID":"8797d11d-3098-476c-9273-b62dc97e1558","Type":"ContainerDied","Data":"8779db10fbef68937802a29919f9f9dd2cbddf39104c74c0fdd50ee885a5ca23"} Oct 02 11:36:38 crc kubenswrapper[4658]: I1002 11:36:38.821282 4658 generic.go:334] "Generic (PLEG): container finished" podID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" containerID="29374d3027afadc1219abc9cf8b80cf6e2cc79d695109e13531f4660ad1f4722" exitCode=0 Oct 02 11:36:38 crc kubenswrapper[4658]: I1002 11:36:38.821778 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ml5sj" event={"ID":"efa1ebca-0cdd-4bce-adf2-e8273c3448f1","Type":"ContainerDied","Data":"29374d3027afadc1219abc9cf8b80cf6e2cc79d695109e13531f4660ad1f4722"} Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.167373 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78b685455c-5zn4s"] Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.198721 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:36:39 crc kubenswrapper[4658]: E1002 11:36:39.199462 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8653761-102b-4879-ba09-1b263a960052" containerName="init" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.199570 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8653761-102b-4879-ba09-1b263a960052" containerName="init" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.199884 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8653761-102b-4879-ba09-1b263a960052" containerName="init" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.202356 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.209543 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.220002 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.238137 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.250720 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-776f4bfd7b-cm7vj"] Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.253030 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.268209 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-776f4bfd7b-cm7vj"] Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313185 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313255 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-scripts\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313280 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313503 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313566 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-combined-ca-bundle\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313588 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313615 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313664 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-logs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313685 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7ch4\" (UniqueName: \"kubernetes.io/projected/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-kube-api-access-c7ch4\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313700 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-tls-certs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.313746 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-config-data\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.314109 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.314131 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrk2z\" (UniqueName: \"kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.314167 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-secret-key\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.415933 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 
11:36:39.416007 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-logs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416037 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7ch4\" (UniqueName: \"kubernetes.io/projected/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-kube-api-access-c7ch4\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416055 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-tls-certs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416091 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-config-data\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416162 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416187 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrk2z\" (UniqueName: \"kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416208 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-secret-key\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416282 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416365 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-scripts\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416395 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416431 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416463 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-combined-ca-bundle\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416489 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.416528 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-logs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.417953 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-scripts\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.418331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.418395 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-config-data\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.418513 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.418626 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: 
I1002 11:36:39.430075 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-combined-ca-bundle\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.430206 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.430479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-tls-certs\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.432416 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-horizon-secret-key\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.434973 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7ch4\" (UniqueName: \"kubernetes.io/projected/02408c48-14d8-4a7b-8ebf-79fd2fa1b924-kube-api-access-c7ch4\") pod \"horizon-776f4bfd7b-cm7vj\" (UID: \"02408c48-14d8-4a7b-8ebf-79fd2fa1b924\") " pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.435934 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrk2z\" (UniqueName: \"kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.443504 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.465233 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle\") pod \"horizon-6dbf7b8b8b-kj6xr\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.542860 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:36:39 crc kubenswrapper[4658]: I1002 11:36:39.595905 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:36:40 crc kubenswrapper[4658]: I1002 11:36:40.386364 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:36:40 crc kubenswrapper[4658]: I1002 11:36:40.444968 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"] Oct 02 11:36:40 crc kubenswrapper[4658]: I1002 11:36:40.445254 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" containerID="cri-o://410c6bd0ab23ea069efbd18fc4d6a500ef5ea2e6a88fb91ae89506b18ba894a8" gracePeriod=10 Oct 02 11:36:40 crc kubenswrapper[4658]: I1002 11:36:40.845465 4658 generic.go:334] "Generic (PLEG): container finished" podID="8145034e-a3f8-413f-b306-26f4b240bd85" containerID="410c6bd0ab23ea069efbd18fc4d6a500ef5ea2e6a88fb91ae89506b18ba894a8" exitCode=0 Oct 02 11:36:40 crc kubenswrapper[4658]: I1002 11:36:40.845546 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" event={"ID":"8145034e-a3f8-413f-b306-26f4b240bd85","Type":"ContainerDied","Data":"410c6bd0ab23ea069efbd18fc4d6a500ef5ea2e6a88fb91ae89506b18ba894a8"} Oct 02 11:36:43 crc kubenswrapper[4658]: I1002 11:36:43.335190 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.550946 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.578089 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data\") pod \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.578158 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle\") pod \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.578214 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data\") pod \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.578256 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwqhk\" (UniqueName: \"kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk\") pod \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\" (UID: \"efa1ebca-0cdd-4bce-adf2-e8273c3448f1\") " Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.584850 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk" (OuterVolumeSpecName: "kube-api-access-kwqhk") pod "efa1ebca-0cdd-4bce-adf2-e8273c3448f1" (UID: "efa1ebca-0cdd-4bce-adf2-e8273c3448f1"). InnerVolumeSpecName "kube-api-access-kwqhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.608221 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "efa1ebca-0cdd-4bce-adf2-e8273c3448f1" (UID: "efa1ebca-0cdd-4bce-adf2-e8273c3448f1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.614586 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efa1ebca-0cdd-4bce-adf2-e8273c3448f1" (UID: "efa1ebca-0cdd-4bce-adf2-e8273c3448f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.634490 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data" (OuterVolumeSpecName: "config-data") pod "efa1ebca-0cdd-4bce-adf2-e8273c3448f1" (UID: "efa1ebca-0cdd-4bce-adf2-e8273c3448f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.681131 4658 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.681185 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.681194 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.681207 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwqhk\" (UniqueName: \"kubernetes.io/projected/efa1ebca-0cdd-4bce-adf2-e8273c3448f1-kube-api-access-kwqhk\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.905865 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ml5sj" event={"ID":"efa1ebca-0cdd-4bce-adf2-e8273c3448f1","Type":"ContainerDied","Data":"e151742b21a554e9b6ebe7aec369c327e35d35d87bb814e5635ffeb572d76313"} Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.905911 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e151742b21a554e9b6ebe7aec369c327e35d35d87bb814e5635ffeb572d76313" Oct 02 11:36:46 crc kubenswrapper[4658]: I1002 11:36:46.905921 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ml5sj" Oct 02 11:36:47 crc kubenswrapper[4658]: I1002 11:36:47.999362 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: E1002 11:36:48.000875 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" containerName="watcher-db-sync" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.000978 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" containerName="watcher-db-sync" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.001267 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" containerName="watcher-db-sync" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.002143 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.013750 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-7wq4z" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.014249 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.016994 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.075206 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.084405 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.095248 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.104759 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.125932 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-config-data\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.126090 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.126184 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjgw\" (UniqueName: \"kubernetes.io/projected/dba2292e-4150-4a9d-9b22-49482e381c6c-kube-api-access-hcjgw\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.126259 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba2292e-4150-4a9d-9b22-49482e381c6c-logs\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.166664 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.168022 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.169855 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.208462 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227558 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227609 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-logs\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227640 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227665 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rgkd\" (UniqueName: \"kubernetes.io/projected/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-kube-api-access-7rgkd\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227707 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjgw\" (UniqueName: \"kubernetes.io/projected/dba2292e-4150-4a9d-9b22-49482e381c6c-kube-api-access-hcjgw\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227737 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227768 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-config-data\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227796 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba2292e-4150-4a9d-9b22-49482e381c6c-logs\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 
11:36:48.227831 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227874 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227914 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-config-data\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227950 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftk6v\" (UniqueName: \"kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.227980 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.228028 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.229873 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba2292e-4150-4a9d-9b22-49482e381c6c-logs\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.244526 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.250903 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjgw\" (UniqueName: \"kubernetes.io/projected/dba2292e-4150-4a9d-9b22-49482e381c6c-kube-api-access-hcjgw\") pod \"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.259641 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba2292e-4150-4a9d-9b22-49482e381c6c-config-data\") pod 
\"watcher-applier-0\" (UID: \"dba2292e-4150-4a9d-9b22-49482e381c6c\") " pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330124 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330208 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftk6v\" (UniqueName: \"kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330241 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330279 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330340 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-logs\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330369 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330395 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rgkd\" (UniqueName: \"kubernetes.io/projected/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-kube-api-access-7rgkd\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330441 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330471 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-config-data\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.330506 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.338026 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.339495 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.340163 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-logs\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.341230 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.342995 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.344630 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.348277 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.348836 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.356109 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-config-data\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.373132 4658 
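
The reconciler entries above trace the kubelet's per-volume mount flow for the three watcher pods: VerifyControllerAttachedVolume, then "MountVolume started", then "MountVolume.SetUp succeeded", once for each secret, projected, and empty-dir volume. A minimal client-go sketch that lists the same volume set from the API side; it assumes a kubeconfig at clientcmd's default path with read access to the cluster, and reuses the pod and namespace names from the log:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: kubeconfig at ~/.kube/config (clientcmd's default location).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(), "watcher-applier-0", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Each reconciler_common.go line above corresponds to one of these volumes.
	for _, v := range pod.Spec.Volumes {
		fmt.Printf("volume %q secret=%v projected=%v emptyDir=%v\n",
			v.Name, v.Secret != nil, v.Projected != nil, v.EmptyDir != nil)
	}
}
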
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.373823 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rgkd\" (UniqueName: \"kubernetes.io/projected/34ba94d4-e1db-40a9-93e7-5a4e053ae8db-kube-api-access-7rgkd\") pod \"watcher-decision-engine-0\" (UID: \"34ba94d4-e1db-40a9-93e7-5a4e053ae8db\") " pod="openstack/watcher-decision-engine-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.375569 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftk6v\" (UniqueName: \"kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v\") pod \"watcher-api-0\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.434644 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:36:48 crc kubenswrapper[4658]: I1002 11:36:48.511846 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.043755 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.044410 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-47s9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromS
ource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-9hqkv_openstack(a4602160-442e-4a87-bacb-3493da6f4dad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.046403 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-9hqkv" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.085743 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.086080 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb8h568h599hb8h5h5f6h65dh5d5hf4h57h7dh5cbh59fh568h549h665h54dh5bch5cch68bh677h5d7h59hfdh688h555h66ch7bhffh549h686h69q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmnlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-77b58fd99f-hfqr9_openstack(d6380c8f-2c75-46e1-b055-80c6f4ecdde5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.089634 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-77b58fd99f-hfqr9" podUID="d6380c8f-2c75-46e1-b055-80c6f4ecdde5" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.108848 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.109043 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f7h649h696h55dh594h59h5dbh5c9h544hf7h55ch5dbh5ch5fbh559h6dh76hf5h7ch85h686h79h589h56hd9h596hb9h5d4h9ch57fh646h56bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slw8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f5d4896d9-72pgt_openstack(e4842274-d590-4208-8a18-892db2b9e824): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.111504 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f5d4896d9-72pgt" podUID="e4842274-d590-4208-8a18-892db2b9e824" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.138744 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284690 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46ph5\" (UniqueName: \"kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284810 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284857 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284898 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284927 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.284967 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle\") pod \"8797d11d-3098-476c-9273-b62dc97e1558\" (UID: \"8797d11d-3098-476c-9273-b62dc97e1558\") " Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.289696 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts" (OuterVolumeSpecName: "scripts") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.290338 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.290856 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.294828 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5" (OuterVolumeSpecName: "kube-api-access-46ph5") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "kube-api-access-46ph5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.315592 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data" (OuterVolumeSpecName: "config-data") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.338930 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8797d11d-3098-476c-9273-b62dc97e1558" (UID: "8797d11d-3098-476c-9273-b62dc97e1558"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387469 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46ph5\" (UniqueName: \"kubernetes.io/projected/8797d11d-3098-476c-9273-b62dc97e1558-kube-api-access-46ph5\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387502 4658 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387513 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387521 4658 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387530 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.387538 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8797d11d-3098-476c-9273-b62dc97e1558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.985594 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqr7t" Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.985591 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqr7t" event={"ID":"8797d11d-3098-476c-9273-b62dc97e1558","Type":"ContainerDied","Data":"76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0"} Oct 02 11:36:51 crc kubenswrapper[4658]: I1002 11:36:51.986163 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dd9d2608d830804848f507ca7b78cf13f1fde43c9e9a75544b51fa25b3b7b0" Oct 02 11:36:51 crc kubenswrapper[4658]: E1002 11:36:51.987243 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-9hqkv" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.237367 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sqr7t"] Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.249255 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sqr7t"] Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.323186 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dc6wn"] Oct 02 11:36:52 crc kubenswrapper[4658]: E1002 11:36:52.323716 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8797d11d-3098-476c-9273-b62dc97e1558" containerName="keystone-bootstrap" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.323733 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8797d11d-3098-476c-9273-b62dc97e1558" containerName="keystone-bootstrap" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.323978 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8797d11d-3098-476c-9273-b62dc97e1558" containerName="keystone-bootstrap" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.325709 4658 util.go:30] "No sandbox for pod can be found. 
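
The SyncLoop DELETE/REMOVE of keystone-bootstrap-sqr7t followed immediately by the SyncLoop ADD of keystone-bootstrap-dc6wn is a job pod being replaced through the API; the kubelet's source="api" SyncLoop entries correspond one-for-one to watch events on the pod resource. A small watch sketch (same kubeconfig assumption) that surfaces the same sequence from a client:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	w, err := cs.CoreV1().Pods("openstack").Watch(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		// ADDED/MODIFIED/DELETED here are what the kubelet logs as
		// "SyncLoop ADD" / "SyncLoop UPDATE" / "SyncLoop DELETE" from source="api".
		fmt.Println(ev.Type, pod.Namespace+"/"+pod.Name)
	}
}
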
Need to start a new one" pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.327761 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.329939 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bltpc" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.330162 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.332598 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.342182 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dc6wn"] Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403226 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403279 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403352 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403460 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403527 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.403694 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkj4z\" (UniqueName: \"kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505104 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data\") pod \"keystone-bootstrap-dc6wn\" (UID: 
\"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505471 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505511 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505541 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505564 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.505621 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkj4z\" (UniqueName: \"kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.510330 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.510347 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.510915 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.511012 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.512229 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.524984 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkj4z\" (UniqueName: \"kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z\") pod \"keystone-bootstrap-dc6wn\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:52 crc kubenswrapper[4658]: I1002 11:36:52.665618 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:36:53 crc kubenswrapper[4658]: I1002 11:36:53.962518 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8797d11d-3098-476c-9273-b62dc97e1558" path="/var/lib/kubelet/pods/8797d11d-3098-476c-9273-b62dc97e1558/volumes" Oct 02 11:36:55 crc kubenswrapper[4658]: I1002 11:36:55.012585 4658 generic.go:334] "Generic (PLEG): container finished" podID="87a291e0-0291-4591-8d80-818338d6ae2d" containerID="b815b56b43296362dc4f3470f3d7e8ef1d65ff3d6f6ed7a1580287738ab2e409" exitCode=0 Oct 02 11:36:55 crc kubenswrapper[4658]: I1002 11:36:55.012885 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbvvc" event={"ID":"87a291e0-0291-4591-8d80-818338d6ae2d","Type":"ContainerDied","Data":"b815b56b43296362dc4f3470f3d7e8ef1d65ff3d6f6ed7a1580287738ab2e409"} Oct 02 11:36:58 crc kubenswrapper[4658]: I1002 11:36:58.336352 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 02 11:36:58 crc kubenswrapper[4658]: I1002 11:36:58.338863 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" Oct 02 11:37:03 crc kubenswrapper[4658]: I1002 11:37:03.338358 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.871893 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.879771 4658 util.go:48] "No ready sandbox for pod can be found. 
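
The dnsmasq-dns readiness failures above ("connection refused", then "i/o timeout" against 10.217.0.135:5353) are plain TCP-socket probes: the kubelet dials the pod IP and treats a successful connect as Ready. A standalone sketch of the same check; the address is taken from the probe output, and since it is a pod IP it is only assumed reachable from the node itself:

package main

import (
	"fmt"
	"net"
	"time"
)

// Roughly what a kubelet TCP readiness probe does: dial the pod IP and
// port, and treat a successful connection as passing.
func main() {
	conn, err := net.DialTimeout("tcp", "10.217.0.135:5353", time.Second)
	if err != nil {
		// The error strings here match the probeResult output in the log,
		// e.g. "connect: connection refused" or "i/o timeout".
		fmt.Println("probe failed:", err)
		return
	}
	conn.Close()
	fmt.Println("probe succeeded")
}
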
Need to start a new one" pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.943813 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs\") pod \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.943908 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data\") pod \"e4842274-d590-4208-8a18-892db2b9e824\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944040 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key\") pod \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944117 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key\") pod \"e4842274-d590-4208-8a18-892db2b9e824\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944204 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmnlr\" (UniqueName: \"kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr\") pod \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944280 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slw8v\" (UniqueName: \"kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v\") pod \"e4842274-d590-4208-8a18-892db2b9e824\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944369 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs\") pod \"e4842274-d590-4208-8a18-892db2b9e824\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944396 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs" (OuterVolumeSpecName: "logs") pod "d6380c8f-2c75-46e1-b055-80c6f4ecdde5" (UID: "d6380c8f-2c75-46e1-b055-80c6f4ecdde5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944424 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts\") pod \"e4842274-d590-4208-8a18-892db2b9e824\" (UID: \"e4842274-d590-4208-8a18-892db2b9e824\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944492 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts\") pod \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944516 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data\") pod \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\" (UID: \"d6380c8f-2c75-46e1-b055-80c6f4ecdde5\") " Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.944654 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data" (OuterVolumeSpecName: "config-data") pod "e4842274-d590-4208-8a18-892db2b9e824" (UID: "e4842274-d590-4208-8a18-892db2b9e824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945326 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts" (OuterVolumeSpecName: "scripts") pod "d6380c8f-2c75-46e1-b055-80c6f4ecdde5" (UID: "d6380c8f-2c75-46e1-b055-80c6f4ecdde5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945459 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs" (OuterVolumeSpecName: "logs") pod "e4842274-d590-4208-8a18-892db2b9e824" (UID: "e4842274-d590-4208-8a18-892db2b9e824"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945588 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts" (OuterVolumeSpecName: "scripts") pod "e4842274-d590-4208-8a18-892db2b9e824" (UID: "e4842274-d590-4208-8a18-892db2b9e824"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945658 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data" (OuterVolumeSpecName: "config-data") pod "d6380c8f-2c75-46e1-b055-80c6f4ecdde5" (UID: "d6380c8f-2c75-46e1-b055-80c6f4ecdde5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945812 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4842274-d590-4208-8a18-892db2b9e824-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945830 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.945841 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.946364 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.946394 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.946405 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4842274-d590-4208-8a18-892db2b9e824-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.950773 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr" (OuterVolumeSpecName: "kube-api-access-mmnlr") pod "d6380c8f-2c75-46e1-b055-80c6f4ecdde5" (UID: "d6380c8f-2c75-46e1-b055-80c6f4ecdde5"). InnerVolumeSpecName "kube-api-access-mmnlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.956618 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v" (OuterVolumeSpecName: "kube-api-access-slw8v") pod "e4842274-d590-4208-8a18-892db2b9e824" (UID: "e4842274-d590-4208-8a18-892db2b9e824"). InnerVolumeSpecName "kube-api-access-slw8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.956675 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6380c8f-2c75-46e1-b055-80c6f4ecdde5" (UID: "d6380c8f-2c75-46e1-b055-80c6f4ecdde5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:04 crc kubenswrapper[4658]: I1002 11:37:04.963035 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e4842274-d590-4208-8a18-892db2b9e824" (UID: "e4842274-d590-4208-8a18-892db2b9e824"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.048547 4658 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.048588 4658 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4842274-d590-4208-8a18-892db2b9e824-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.048601 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmnlr\" (UniqueName: \"kubernetes.io/projected/d6380c8f-2c75-46e1-b055-80c6f4ecdde5-kube-api-access-mmnlr\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.048613 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slw8v\" (UniqueName: \"kubernetes.io/projected/e4842274-d590-4208-8a18-892db2b9e824-kube-api-access-slw8v\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.093977 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b58fd99f-hfqr9" event={"ID":"d6380c8f-2c75-46e1-b055-80c6f4ecdde5","Type":"ContainerDied","Data":"607c7e9aca50e975a19e64f25fd8e4bebf9f3c1557404b65595f2b414a59f3a3"} Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.094014 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77b58fd99f-hfqr9" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.095070 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5d4896d9-72pgt" event={"ID":"e4842274-d590-4208-8a18-892db2b9e824","Type":"ContainerDied","Data":"cdb8052acba5879ff1e2fc1b08b7a0d9e441db12da6782c7778ee56d87772d08"} Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.095137 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f5d4896d9-72pgt" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.179696 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.196596 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77b58fd99f-hfqr9"] Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.213333 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.221882 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f5d4896d9-72pgt"] Oct 02 11:37:05 crc kubenswrapper[4658]: E1002 11:37:05.486142 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 02 11:37:05 crc kubenswrapper[4658]: E1002 11:37:05.486324 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-522tx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-d5ppn_openstack(057d8045-79f8-4f4d-9b29-ce1f517e0f94): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:37:05 crc kubenswrapper[4658]: E1002 11:37:05.487394 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-d5ppn" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.593633 4658 util.go:48] "No ready sandbox for pod can be found. 
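
Error-level records like the barbican-db-sync pull failure above all share the klog header format (severity letter, MMDD date, timestamp, PID, source file:line). A sketch that filters a captured journal on stdin down to just the E-level records; the regex is written against the header shape visible in these lines, not any published grammar:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches klog headers as they appear above, e.g.
// "E1002 11:37:05.486142    4658 log.go:32] ...".
var klogRe = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		m := klogRe.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		sev, date, ts, pid, src, msg := m[1], m[2], m[3], m[4], m[5], m[6]
		if sev == "E" { // keep only error-level records, like the pull failures
			fmt.Printf("%s %s pid=%s %s %s\n", date, ts, pid, src, msg)
		}
	}
}
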
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.604902 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kbvvc" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674130 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674286 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674372 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle\") pod \"87a291e0-0291-4591-8d80-818338d6ae2d\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674416 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674441 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674485 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqzz7\" (UniqueName: \"kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7\") pod \"87a291e0-0291-4591-8d80-818338d6ae2d\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674565 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data\") pod \"87a291e0-0291-4591-8d80-818338d6ae2d\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674591 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data\") pod \"87a291e0-0291-4591-8d80-818338d6ae2d\" (UID: \"87a291e0-0291-4591-8d80-818338d6ae2d\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674886 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.674915 4658 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9sjbx\" (UniqueName: \"kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx\") pod \"8145034e-a3f8-413f-b306-26f4b240bd85\" (UID: \"8145034e-a3f8-413f-b306-26f4b240bd85\") " Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.679434 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx" (OuterVolumeSpecName: "kube-api-access-9sjbx") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "kube-api-access-9sjbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.680328 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87a291e0-0291-4591-8d80-818338d6ae2d" (UID: "87a291e0-0291-4591-8d80-818338d6ae2d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.681649 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7" (OuterVolumeSpecName: "kube-api-access-vqzz7") pod "87a291e0-0291-4591-8d80-818338d6ae2d" (UID: "87a291e0-0291-4591-8d80-818338d6ae2d"). InnerVolumeSpecName "kube-api-access-vqzz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.706346 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87a291e0-0291-4591-8d80-818338d6ae2d" (UID: "87a291e0-0291-4591-8d80-818338d6ae2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.721476 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data" (OuterVolumeSpecName: "config-data") pod "87a291e0-0291-4591-8d80-818338d6ae2d" (UID: "87a291e0-0291-4591-8d80-818338d6ae2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.722597 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.724743 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.730246 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.732680 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.741436 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config" (OuterVolumeSpecName: "config") pod "8145034e-a3f8-413f-b306-26f4b240bd85" (UID: "8145034e-a3f8-413f-b306-26f4b240bd85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.788994 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789033 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789047 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789058 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789068 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqzz7\" (UniqueName: \"kubernetes.io/projected/87a291e0-0291-4591-8d80-818338d6ae2d-kube-api-access-vqzz7\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789076 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789085 4658 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87a291e0-0291-4591-8d80-818338d6ae2d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789094 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: 
I1002 11:37:05.789104 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjbx\" (UniqueName: \"kubernetes.io/projected/8145034e-a3f8-413f-b306-26f4b240bd85-kube-api-access-9sjbx\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.789112 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8145034e-a3f8-413f-b306-26f4b240bd85-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.972004 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6380c8f-2c75-46e1-b055-80c6f4ecdde5" path="/var/lib/kubelet/pods/d6380c8f-2c75-46e1-b055-80c6f4ecdde5/volumes" Oct 02 11:37:05 crc kubenswrapper[4658]: I1002 11:37:05.972627 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4842274-d590-4208-8a18-892db2b9e824" path="/var/lib/kubelet/pods/e4842274-d590-4208-8a18-892db2b9e824/volumes" Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.128821 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbvvc" event={"ID":"87a291e0-0291-4591-8d80-818338d6ae2d","Type":"ContainerDied","Data":"673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd"} Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.128872 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="673530aac1ade61da3eea1822e295666130bc59d75920c46754e937a4f78eccd" Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.128908 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kbvvc" Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.130333 4658 generic.go:334] "Generic (PLEG): container finished" podID="8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" containerID="806bda466dc507d954b7c5a64bd6585e9b0914e91020581cda090db9bef02e16" exitCode=0 Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.130368 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jftqc" event={"ID":"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab","Type":"ContainerDied","Data":"806bda466dc507d954b7c5a64bd6585e9b0914e91020581cda090db9bef02e16"} Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.134434 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" event={"ID":"8145034e-a3f8-413f-b306-26f4b240bd85","Type":"ContainerDied","Data":"41ec6a4251d87763fba7c1a1723bae95ffe66d7e0f5d5fb8907fd204db52f60f"} Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.134478 4658 scope.go:117] "RemoveContainer" containerID="410c6bd0ab23ea069efbd18fc4d6a500ef5ea2e6a88fb91ae89506b18ba894a8" Oct 02 11:37:06 crc kubenswrapper[4658]: E1002 11:37:06.135457 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-d5ppn" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.140206 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.184869 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"] Oct 02 11:37:06 crc kubenswrapper[4658]: I1002 11:37:06.193553 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-79xjk"] Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.102730 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.103116 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.103129 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.103142 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" containerName="glance-db-sync" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.103148 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" containerName="glance-db-sync" Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.103172 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="init" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.103178 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="init" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.103359 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.103373 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" containerName="glance-db-sync" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.104836 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.147368 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.169961 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.172083 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lf8r5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s6w77_openstack(6378c687-5c50-4efd-8cc5-b7aa4ef82297): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:37:07 crc kubenswrapper[4658]: E1002 11:37:07.174441 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s6w77" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" Oct 02 
11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227087 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4gv9\" (UniqueName: \"kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227161 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227180 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227199 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227234 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.227267 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.273603 4658 scope.go:117] "RemoveContainer" containerID="c955af8be2de8d584b337d64425bd2f2c67abe21965841e50a36c5782c1a4f14" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329110 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329168 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329228 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329274 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329424 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4gv9\" (UniqueName: \"kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.329505 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.330846 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.332074 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.332877 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.333454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.333562 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.358698 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4gv9\" (UniqueName: \"kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9\") pod \"dnsmasq-dns-8b5c85b87-rrjlk\" (UID: 
\"23ed6974-6382-47c3-89b3-6eef13014502\") " pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.504633 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.786999 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jftqc" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.863772 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.954050 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle\") pod \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.954454 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k96sl\" (UniqueName: \"kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl\") pod \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.954669 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config\") pod \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\" (UID: \"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab\") " Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.960012 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl" (OuterVolumeSpecName: "kube-api-access-k96sl") pod "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" (UID: "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab"). InnerVolumeSpecName "kube-api-access-k96sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:07 crc kubenswrapper[4658]: I1002 11:37:07.978681 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" path="/var/lib/kubelet/pods/8145034e-a3f8-413f-b306-26f4b240bd85/volumes" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.006636 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" (UID: "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.030429 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config" (OuterVolumeSpecName: "config") pod "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" (UID: "8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.063734 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.063771 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.063783 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k96sl\" (UniqueName: \"kubernetes.io/projected/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab-kube-api-access-k96sl\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.173925 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.174150 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.174161 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-776f4bfd7b-cm7vj"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.174174 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.174187 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.174200 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: E1002 11:37:08.175186 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" containerName="neutron-db-sync" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.175202 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" containerName="neutron-db-sync" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.175411 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" containerName="neutron-db-sync" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.176425 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.198221 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t7vc6" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.199041 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.199316 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.218545 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.226411 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dba2292e-4150-4a9d-9b22-49482e381c6c","Type":"ContainerStarted","Data":"6d42d5afe9fc46c48357480a25602318dc8d9675df09393341723a72a5c40574"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.240761 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"34ba94d4-e1db-40a9-93e7-5a4e053ae8db","Type":"ContainerStarted","Data":"75f68e968978447bd91e4876b411ea4faec5147c2cdaca546dcb0e6d018ef21b"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.246557 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerStarted","Data":"4742d9384407970bd1c5bec11bf52283c83e90ec7fee62e6559d9b338d3bd304"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.286976 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dc6wn"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.287019 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776f4bfd7b-cm7vj" event={"ID":"02408c48-14d8-4a7b-8ebf-79fd2fa1b924","Type":"ContainerStarted","Data":"2aafa0200d2f8362ccfd285aca340a2a7f767f0b61bc4b5c9f632d0a9a4eacc8"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.312072 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerStarted","Data":"216113a97fe44bd440b15a58eef28be0f657c4645f1843967f36267fbdae5183"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.319687 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerStarted","Data":"92c61fede454e4d930a8b2a7c7439bd70f8ca71bee1ff88d0510a17803277073"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320338 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320583 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " 
pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320628 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320685 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320719 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320744 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.320791 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczdx\" (UniqueName: \"kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.326754 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hqkv" event={"ID":"a4602160-442e-4a87-bacb-3493da6f4dad","Type":"ContainerStarted","Data":"2903d0a3f21ea87cd2d81cd948f6918ca802a6305e105e94d14a48da913c5027"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.339539 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-79xjk" podUID="8145034e-a3f8-413f-b306-26f4b240bd85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.384274 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jftqc" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.384380 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jftqc" event={"ID":"8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab","Type":"ContainerDied","Data":"4a81197aa9c95a844236de94b8cdef32aca5f00b8f300721eff81d37ae62d63a"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.384422 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a81197aa9c95a844236de94b8cdef32aca5f00b8f300721eff81d37ae62d63a" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.395820 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448000 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9hqkv" podStartSLOduration=4.295033375 podStartE2EDuration="39.44798375s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="2025-10-02 11:36:32.178851554 +0000 UTC m=+1073.070005121" lastFinishedPulling="2025-10-02 11:37:07.331801929 +0000 UTC m=+1108.222955496" observedRunningTime="2025-10-02 11:37:08.355219463 +0000 UTC m=+1109.246373030" watchObservedRunningTime="2025-10-02 11:37:08.44798375 +0000 UTC m=+1109.339137317" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448758 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448260 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448904 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448931 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.448949 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.449192 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.449229 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczdx\" (UniqueName: \"kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.449307 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.449692 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.454071 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.462414 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.485655 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerStarted","Data":"55b7481862505aa398b26b4af18a702ced1cda0d0467d470ac7d7ae05dbd32f5"} Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.485795 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.492865 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.496778 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.496972 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.497506 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczdx\" (UniqueName: \"kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: E1002 11:37:08.502798 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-s6w77" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.542044 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.567747 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.594249 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.596025 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.606567 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.621696 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.633380 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.644979 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.648598 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.649776 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.650253 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.650895 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4r28x" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654038 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klh74\" (UniqueName: \"kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654083 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654133 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654203 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654239 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654272 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.654317 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 
crc kubenswrapper[4658]: I1002 11:37:08.681926 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756325 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klh74\" (UniqueName: \"kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756383 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756411 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756433 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756460 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756484 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756503 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756531 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756562 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrvz\" (UniqueName: 
\"kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756576 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gczjd\" (UniqueName: \"kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756592 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756611 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756648 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756681 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756739 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756760 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756779 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.756798 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.761905 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.762502 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.762665 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.788217 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.792047 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.793751 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.839917 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klh74\" (UniqueName: \"kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861276 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861341 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: 
\"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861375 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrvz\" (UniqueName: \"kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861391 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gczjd\" (UniqueName: \"kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861409 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861427 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861508 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861528 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861564 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861584 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.861606 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " 
pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.872079 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.872850 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.875468 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.897561 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.897650 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.898192 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.907093 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.914794 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.923622 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:08 crc kubenswrapper[4658]: I1002 11:37:08.924279 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " 
pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.002590 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gczjd\" (UniqueName: \"kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd\") pod \"dnsmasq-dns-84b966f6c9-dt2rk\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.002694 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrvz\" (UniqueName: \"kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.003095 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs\") pod \"neutron-565ccbd57b-kt62s\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.132521 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.230665 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.301657 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.559920 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776f4bfd7b-cm7vj" event={"ID":"02408c48-14d8-4a7b-8ebf-79fd2fa1b924","Type":"ContainerStarted","Data":"7337e86a8cee452387028ba152fe16d3326866ad81fbbdb19e5ff05e580c60d7"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.571845 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerStarted","Data":"8600e21cfa45a23fc9cec0c62d4292779da66d2c8bb391a471760b22b24a0407"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.573270 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78b685455c-5zn4s" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon-log" containerID="cri-o://92c61fede454e4d930a8b2a7c7439bd70f8ca71bee1ff88d0510a17803277073" gracePeriod=30 Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.577436 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78b685455c-5zn4s" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon" containerID="cri-o://8600e21cfa45a23fc9cec0c62d4292779da66d2c8bb391a471760b22b24a0407" gracePeriod=30 Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.621789 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78b685455c-5zn4s" podStartSLOduration=5.164246188 podStartE2EDuration="40.621753874s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="2025-10-02 11:36:31.490466786 +0000 UTC m=+1072.381620353" lastFinishedPulling="2025-10-02 11:37:06.947974472 +0000 UTC m=+1107.839128039" 
observedRunningTime="2025-10-02 11:37:09.619382978 +0000 UTC m=+1110.510536545" watchObservedRunningTime="2025-10-02 11:37:09.621753874 +0000 UTC m=+1110.512907441" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.631722 4658 generic.go:334] "Generic (PLEG): container finished" podID="23ed6974-6382-47c3-89b3-6eef13014502" containerID="429fada1f06ca634b24c745633a4cd61ae3ca0e756021d42600c0fd0da6f951b" exitCode=0 Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.631839 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" event={"ID":"23ed6974-6382-47c3-89b3-6eef13014502","Type":"ContainerDied","Data":"429fada1f06ca634b24c745633a4cd61ae3ca0e756021d42600c0fd0da6f951b"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.631870 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" event={"ID":"23ed6974-6382-47c3-89b3-6eef13014502","Type":"ContainerStarted","Data":"82d385cd31211b1065e06a83775db35021a5049b41ee83c4d65ac94351ff4054"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.641757 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dc6wn" event={"ID":"916133b3-3541-40ec-b32a-4b8bf4870d7f","Type":"ContainerStarted","Data":"6a24cfcc9773fd7d06ec82cd910121ea08f1cec024a89c168796211b0510da67"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.641803 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dc6wn" event={"ID":"916133b3-3541-40ec-b32a-4b8bf4870d7f","Type":"ContainerStarted","Data":"abd1b4308b9ad8a460c57cac96df5f8a26895ad9aa2b07f5b86e73a0deac6315"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.665605 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerStarted","Data":"209d424c58b2dd299d676c2aa4377aed2a2319c6703c5804596f67401b167ef5"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.665657 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerStarted","Data":"6ada99d9e53070da995cc306284cd32368a0608d42250281379a37775f7a3e2a"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.666711 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.668179 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerStarted","Data":"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd"} Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.683974 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dc6wn" podStartSLOduration=17.683958654 podStartE2EDuration="17.683958654s" podCreationTimestamp="2025-10-02 11:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:09.682153065 +0000 UTC m=+1110.573306632" watchObservedRunningTime="2025-10-02 11:37:09.683958654 +0000 UTC m=+1110.575112211" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.693491 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" probeResult="failure" 
output="Get \"http://10.217.0.159:9322/\": dial tcp 10.217.0.159:9322: connect: connection refused" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.728347 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podStartSLOduration=30.728325732 podStartE2EDuration="30.728325732s" podCreationTimestamp="2025-10-02 11:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:09.703370654 +0000 UTC m=+1110.594524231" watchObservedRunningTime="2025-10-02 11:37:09.728325732 +0000 UTC m=+1110.619479309" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.745323 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=21.745303136 podStartE2EDuration="21.745303136s" podCreationTimestamp="2025-10-02 11:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:09.730719029 +0000 UTC m=+1110.621872596" watchObservedRunningTime="2025-10-02 11:37:09.745303136 +0000 UTC m=+1110.636456703" Oct 02 11:37:09 crc kubenswrapper[4658]: I1002 11:37:09.867522 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:09 crc kubenswrapper[4658]: W1002 11:37:09.965138 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402b6de2_fa43_4e17_abe7_3af33d08694a.slice/crio-e523c98f332108e04609fcdcbaea94eb4d529496dcc43a10895d26ac42b0b326 WatchSource:0}: Error finding container e523c98f332108e04609fcdcbaea94eb4d529496dcc43a10895d26ac42b0b326: Status 404 returned error can't find the container with id e523c98f332108e04609fcdcbaea94eb4d529496dcc43a10895d26ac42b0b326 Oct 02 11:37:10 crc kubenswrapper[4658]: W1002 11:37:10.195425 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2345975_ffff_42aa_b6eb_0e33a17ba4a2.slice/crio-639d66b71f521eb477d1349074e8e586a1476c06cb75a531078a148bb8b41375 WatchSource:0}: Error finding container 639d66b71f521eb477d1349074e8e586a1476c06cb75a531078a148bb8b41375: Status 404 returned error can't find the container with id 639d66b71f521eb477d1349074e8e586a1476c06cb75a531078a148bb8b41375 Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.230537 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.245725 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.295971 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.441149 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542279 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4gv9\" (UniqueName: \"kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542363 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542563 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542648 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542739 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.542858 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc\") pod \"23ed6974-6382-47c3-89b3-6eef13014502\" (UID: \"23ed6974-6382-47c3-89b3-6eef13014502\") " Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.561781 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9" (OuterVolumeSpecName: "kube-api-access-h4gv9") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "kube-api-access-h4gv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.607310 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.647055 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4gv9\" (UniqueName: \"kubernetes.io/projected/23ed6974-6382-47c3-89b3-6eef13014502-kube-api-access-h4gv9\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.708822 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.712789 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.720707 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.720887 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.723311 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config" (OuterVolumeSpecName: "config") pod "23ed6974-6382-47c3-89b3-6eef13014502" (UID: "23ed6974-6382-47c3-89b3-6eef13014502"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.752192 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.752255 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.752288 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.752333 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.752346 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23ed6974-6382-47c3-89b3-6eef13014502-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.775128 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776f4bfd7b-cm7vj" event={"ID":"02408c48-14d8-4a7b-8ebf-79fd2fa1b924","Type":"ContainerStarted","Data":"b902a68948536244db8695a6e4dd9a6e647d1be696ee4baa78124a8553dfffab"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.780377 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" event={"ID":"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e","Type":"ContainerStarted","Data":"cb59ec3dd88be11738ebcb1bd075df6e35f544de9cbbcf8b7dbb929e1dd29f15"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.788487 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerStarted","Data":"e523c98f332108e04609fcdcbaea94eb4d529496dcc43a10895d26ac42b0b326"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.793624 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" event={"ID":"23ed6974-6382-47c3-89b3-6eef13014502","Type":"ContainerDied","Data":"82d385cd31211b1065e06a83775db35021a5049b41ee83c4d65ac94351ff4054"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.793694 4658 scope.go:117] "RemoveContainer" containerID="429fada1f06ca634b24c745633a4cd61ae3ca0e756021d42600c0fd0da6f951b" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.793874 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rrjlk" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.798639 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerStarted","Data":"6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.802552 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerStarted","Data":"639d66b71f521eb477d1349074e8e586a1476c06cb75a531078a148bb8b41375"} Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.825272 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-776f4bfd7b-cm7vj" podStartSLOduration=31.825248818 podStartE2EDuration="31.825248818s" podCreationTimestamp="2025-10-02 11:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:10.807579922 +0000 UTC m=+1111.698733489" watchObservedRunningTime="2025-10-02 11:37:10.825248818 +0000 UTC m=+1111.716402385" Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.867054 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.889940 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rrjlk"] Oct 02 11:37:10 crc kubenswrapper[4658]: I1002 11:37:10.930706 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.070009 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.826286 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerStarted","Data":"8b9e6470937f047bcaf0df56a3ef9ffe882bc2fcf9ef43edeeb8eab50573ceb1"} Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.828917 4658 generic.go:334] "Generic (PLEG): container finished" podID="a4602160-442e-4a87-bacb-3493da6f4dad" 
containerID="2903d0a3f21ea87cd2d81cd948f6918ca802a6305e105e94d14a48da913c5027" exitCode=0 Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.828976 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hqkv" event={"ID":"a4602160-442e-4a87-bacb-3493da6f4dad","Type":"ContainerDied","Data":"2903d0a3f21ea87cd2d81cd948f6918ca802a6305e105e94d14a48da913c5027"} Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.831041 4658 generic.go:334] "Generic (PLEG): container finished" podID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerID="577c07b1fada8fcf0ae5fda670b69dd3dd76b12419431abed82afd10b6ea93bf" exitCode=0 Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.831110 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" event={"ID":"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e","Type":"ContainerDied","Data":"577c07b1fada8fcf0ae5fda670b69dd3dd76b12419431abed82afd10b6ea93bf"} Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.832739 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerStarted","Data":"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4"} Oct 02 11:37:11 crc kubenswrapper[4658]: I1002 11:37:11.979410 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ed6974-6382-47c3-89b3-6eef13014502" path="/var/lib/kubelet/pods/23ed6974-6382-47c3-89b3-6eef13014502/volumes" Oct 02 11:37:12 crc kubenswrapper[4658]: I1002 11:37:12.875715 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerStarted","Data":"d78dc895ea1f5f03b58b8b06d6046e0521925ac48eac5bc332224a9e7c9be1ee"} Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.394358 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6989c4ffd5-z7vdb"] Oct 02 11:37:13 crc kubenswrapper[4658]: E1002 11:37:13.396737 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ed6974-6382-47c3-89b3-6eef13014502" containerName="init" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.397088 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ed6974-6382-47c3-89b3-6eef13014502" containerName="init" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.399809 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ed6974-6382-47c3-89b3-6eef13014502" containerName="init" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.401158 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.403052 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.407720 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.418128 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6989c4ffd5-z7vdb"] Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.446305 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.446418 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517456 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-combined-ca-bundle\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517506 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-public-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517602 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517672 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-internal-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517709 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-httpd-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517776 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-ovndb-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.517800 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4qs\" (UniqueName: \"kubernetes.io/projected/299ba238-fcb8-4f4b-94ea-73ac08404680-kube-api-access-6g4qs\") pod 
\"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621381 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621460 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-internal-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621499 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-httpd-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621562 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-ovndb-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621583 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4qs\" (UniqueName: \"kubernetes.io/projected/299ba238-fcb8-4f4b-94ea-73ac08404680-kube-api-access-6g4qs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621673 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-combined-ca-bundle\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.621697 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-public-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.628202 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-public-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.632233 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: 
I1002 11:37:13.637584 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-internal-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.642083 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-ovndb-tls-certs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.647180 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-combined-ca-bundle\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.651844 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/299ba238-fcb8-4f4b-94ea-73ac08404680-httpd-config\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.673968 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4qs\" (UniqueName: \"kubernetes.io/projected/299ba238-fcb8-4f4b-94ea-73ac08404680-kube-api-access-6g4qs\") pod \"neutron-6989c4ffd5-z7vdb\" (UID: \"299ba238-fcb8-4f4b-94ea-73ac08404680\") " pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.723431 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.975171 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hqkv" event={"ID":"a4602160-442e-4a87-bacb-3493da6f4dad","Type":"ContainerDied","Data":"fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38"} Oct 02 11:37:13 crc kubenswrapper[4658]: I1002 11:37:13.975540 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0a1641c88a61d611472c7628124f845c323260ac2683d0dab3e49137415f38" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.060767 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9hqkv" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.265949 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle\") pod \"a4602160-442e-4a87-bacb-3493da6f4dad\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.266442 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data\") pod \"a4602160-442e-4a87-bacb-3493da6f4dad\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.266598 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts\") pod \"a4602160-442e-4a87-bacb-3493da6f4dad\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.266639 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47s9v\" (UniqueName: \"kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v\") pod \"a4602160-442e-4a87-bacb-3493da6f4dad\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.266762 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs\") pod \"a4602160-442e-4a87-bacb-3493da6f4dad\" (UID: \"a4602160-442e-4a87-bacb-3493da6f4dad\") " Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.267751 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs" (OuterVolumeSpecName: "logs") pod "a4602160-442e-4a87-bacb-3493da6f4dad" (UID: "a4602160-442e-4a87-bacb-3493da6f4dad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.319543 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v" (OuterVolumeSpecName: "kube-api-access-47s9v") pod "a4602160-442e-4a87-bacb-3493da6f4dad" (UID: "a4602160-442e-4a87-bacb-3493da6f4dad"). InnerVolumeSpecName "kube-api-access-47s9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.348501 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts" (OuterVolumeSpecName: "scripts") pod "a4602160-442e-4a87-bacb-3493da6f4dad" (UID: "a4602160-442e-4a87-bacb-3493da6f4dad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.369837 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4602160-442e-4a87-bacb-3493da6f4dad-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.369860 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.369871 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47s9v\" (UniqueName: \"kubernetes.io/projected/a4602160-442e-4a87-bacb-3493da6f4dad-kube-api-access-47s9v\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.489368 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6989c4ffd5-z7vdb"] Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.702342 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data" (OuterVolumeSpecName: "config-data") pod "a4602160-442e-4a87-bacb-3493da6f4dad" (UID: "a4602160-442e-4a87-bacb-3493da6f4dad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.754926 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4602160-442e-4a87-bacb-3493da6f4dad" (UID: "a4602160-442e-4a87-bacb-3493da6f4dad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.778847 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.778893 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4602160-442e-4a87-bacb-3493da6f4dad-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.867458 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.964427 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9hqkv" Oct 02 11:37:14 crc kubenswrapper[4658]: I1002 11:37:14.964597 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6989c4ffd5-z7vdb" event={"ID":"299ba238-fcb8-4f4b-94ea-73ac08404680","Type":"ContainerStarted","Data":"3cda09415ea35f83cdcad812af74d0f5c5fa9e1aa38b6430bba29cdce865d0c0"} Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.300200 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-574d544bd8-7g449"] Oct 02 11:37:15 crc kubenswrapper[4658]: E1002 11:37:15.300908 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" containerName="placement-db-sync" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.300922 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" containerName="placement-db-sync" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.301116 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" containerName="placement-db-sync" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.303071 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.305418 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.306585 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.306700 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.307621 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.315982 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4wj8" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.323768 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-574d544bd8-7g449"] Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490545 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t96tv\" (UniqueName: \"kubernetes.io/projected/c77ff071-5d94-49df-a4b3-25c8dd727b6e-kube-api-access-t96tv\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490614 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-internal-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490703 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-scripts\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 
02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490799 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-public-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490860 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-combined-ca-bundle\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490915 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ff071-5d94-49df-a4b3-25c8dd727b6e-logs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.490940 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-config-data\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.596129 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-public-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.596594 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-combined-ca-bundle\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.596685 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ff071-5d94-49df-a4b3-25c8dd727b6e-logs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.596712 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-config-data\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.596939 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t96tv\" (UniqueName: \"kubernetes.io/projected/c77ff071-5d94-49df-a4b3-25c8dd727b6e-kube-api-access-t96tv\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 
11:37:15.600405 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-internal-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.601908 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-scripts\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.602549 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ff071-5d94-49df-a4b3-25c8dd727b6e-logs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.610456 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-scripts\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.616227 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-config-data\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.618794 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-internal-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.624206 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-public-tls-certs\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.624973 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ff071-5d94-49df-a4b3-25c8dd727b6e-combined-ca-bundle\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.637207 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t96tv\" (UniqueName: \"kubernetes.io/projected/c77ff071-5d94-49df-a4b3-25c8dd727b6e-kube-api-access-t96tv\") pod \"placement-574d544bd8-7g449\" (UID: \"c77ff071-5d94-49df-a4b3-25c8dd727b6e\") " pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:15 crc kubenswrapper[4658]: I1002 11:37:15.920158 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.010760 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerStarted","Data":"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.011006 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-log" containerID="cri-o://1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" gracePeriod=30 Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.011177 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-httpd" containerID="cri-o://ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" gracePeriod=30 Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.019093 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"34ba94d4-e1db-40a9-93e7-5a4e053ae8db","Type":"ContainerStarted","Data":"12d8979e996b6c2195b8d8367d60fc1a05f60a233e7cc09f1609d887891b1cd8"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.024790 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" event={"ID":"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e","Type":"ContainerStarted","Data":"8b8978037794019d6beb72573549347d8a8c005dcddd7b968afa1939cd31e5b2"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.025435 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.046475 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6989c4ffd5-z7vdb" event={"ID":"299ba238-fcb8-4f4b-94ea-73ac08404680","Type":"ContainerStarted","Data":"190297db07854b2f31da03365fddabb6e4ed79df80df3bebd7bce10a2046204b"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.046841 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.058993 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.058971229 podStartE2EDuration="9.058971229s" podCreationTimestamp="2025-10-02 11:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:16.042922385 +0000 UTC m=+1116.934075952" watchObservedRunningTime="2025-10-02 11:37:16.058971229 +0000 UTC m=+1116.950124796" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.068438 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerStarted","Data":"a87b6456f4a87208b841bf7062661b0b0d8c6155e0207147a6e0c052f784ccd0"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.068477 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.079384 4658 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=22.479960293 podStartE2EDuration="28.079361051s" podCreationTimestamp="2025-10-02 11:36:48 +0000 UTC" firstStartedPulling="2025-10-02 11:37:08.122933843 +0000 UTC m=+1109.014087410" lastFinishedPulling="2025-10-02 11:37:13.722334591 +0000 UTC m=+1114.613488168" observedRunningTime="2025-10-02 11:37:16.073266566 +0000 UTC m=+1116.964420133" watchObservedRunningTime="2025-10-02 11:37:16.079361051 +0000 UTC m=+1116.970514618" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.094316 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerStarted","Data":"630370b49e9083126045ee666e011d798ddc1cb0fc91e00dc2c8d769d93a324d"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.117403 4658 generic.go:334] "Generic (PLEG): container finished" podID="916133b3-3541-40ec-b32a-4b8bf4870d7f" containerID="6a24cfcc9773fd7d06ec82cd910121ea08f1cec024a89c168796211b0510da67" exitCode=0 Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.117499 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dc6wn" event={"ID":"916133b3-3541-40ec-b32a-4b8bf4870d7f","Type":"ContainerDied","Data":"6a24cfcc9773fd7d06ec82cd910121ea08f1cec024a89c168796211b0510da67"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.119551 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" podStartSLOduration=8.119532206 podStartE2EDuration="8.119532206s" podCreationTimestamp="2025-10-02 11:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:16.106168079 +0000 UTC m=+1116.997321666" watchObservedRunningTime="2025-10-02 11:37:16.119532206 +0000 UTC m=+1117.010685773" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.134881 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dba2292e-4150-4a9d-9b22-49482e381c6c","Type":"ContainerStarted","Data":"251c22aab5b404659f7939d66c9c379aef7691f89b8ac97b3247b1e9d6f382c4"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.143842 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerStarted","Data":"31f18f240862b19b43cd0e2d5744d027c1f987b73a461d2472a9db58364ade72"} Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.143978 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-log" containerID="cri-o://8b9e6470937f047bcaf0df56a3ef9ffe882bc2fcf9ef43edeeb8eab50573ceb1" gracePeriod=30 Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.144258 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-httpd" containerID="cri-o://31f18f240862b19b43cd0e2d5744d027c1f987b73a461d2472a9db58364ade72" gracePeriod=30 Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.162793 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-565ccbd57b-kt62s" podStartSLOduration=8.162769049 podStartE2EDuration="8.162769049s" 
podCreationTimestamp="2025-10-02 11:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:16.133762361 +0000 UTC m=+1117.024915928" watchObservedRunningTime="2025-10-02 11:37:16.162769049 +0000 UTC m=+1117.053922616" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.178539 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6989c4ffd5-z7vdb" podStartSLOduration=3.178518313 podStartE2EDuration="3.178518313s" podCreationTimestamp="2025-10-02 11:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:16.171811418 +0000 UTC m=+1117.062964985" watchObservedRunningTime="2025-10-02 11:37:16.178518313 +0000 UTC m=+1117.069671880" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.214444 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.214421671 podStartE2EDuration="9.214421671s" podCreationTimestamp="2025-10-02 11:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:16.203701468 +0000 UTC m=+1117.094855035" watchObservedRunningTime="2025-10-02 11:37:16.214421671 +0000 UTC m=+1117.105575228" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.252490 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=23.646731267 podStartE2EDuration="29.252464277s" podCreationTimestamp="2025-10-02 11:36:47 +0000 UTC" firstStartedPulling="2025-10-02 11:37:08.115036091 +0000 UTC m=+1109.006189658" lastFinishedPulling="2025-10-02 11:37:13.720769101 +0000 UTC m=+1114.611922668" observedRunningTime="2025-10-02 11:37:16.250848806 +0000 UTC m=+1117.142002373" watchObservedRunningTime="2025-10-02 11:37:16.252464277 +0000 UTC m=+1117.143617854" Oct 02 11:37:16 crc kubenswrapper[4658]: E1002 11:37:16.571884 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402b6de2_fa43_4e17_abe7_3af33d08694a.slice/crio-ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402b6de2_fa43_4e17_abe7_3af33d08694a.slice/crio-conmon-ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:37:16 crc kubenswrapper[4658]: I1002 11:37:16.652284 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-574d544bd8-7g449"] Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.125063 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167456 4658 generic.go:334] "Generic (PLEG): container finished" podID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerID="ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" exitCode=0 Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167488 4658 generic.go:334] "Generic (PLEG): container finished" podID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerID="1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" exitCode=143 Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167528 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerDied","Data":"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167566 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerDied","Data":"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167577 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"402b6de2-fa43-4e17-abe7-3af33d08694a","Type":"ContainerDied","Data":"e523c98f332108e04609fcdcbaea94eb4d529496dcc43a10895d26ac42b0b326"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167593 4658 scope.go:117] "RemoveContainer" containerID="ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.167618 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.174860 4658 generic.go:334] "Generic (PLEG): container finished" podID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerID="31f18f240862b19b43cd0e2d5744d027c1f987b73a461d2472a9db58364ade72" exitCode=0 Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.174887 4658 generic.go:334] "Generic (PLEG): container finished" podID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerID="8b9e6470937f047bcaf0df56a3ef9ffe882bc2fcf9ef43edeeb8eab50573ceb1" exitCode=143 Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.174926 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerDied","Data":"31f18f240862b19b43cd0e2d5744d027c1f987b73a461d2472a9db58364ade72"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.174951 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerDied","Data":"8b9e6470937f047bcaf0df56a3ef9ffe882bc2fcf9ef43edeeb8eab50573ceb1"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.177514 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerStarted","Data":"631d442b4a9a3c355372f7c5b9ab59e21341e6a46715ca6efb040d0d65ee0f67"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.187255 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-574d544bd8-7g449" event={"ID":"c77ff071-5d94-49df-a4b3-25c8dd727b6e","Type":"ContainerStarted","Data":"442de8cd54d722341f31ca4235dd3316ba18a8060651400e9a299f52657bbb47"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.198827 4658 scope.go:117] "RemoveContainer" containerID="1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.210350 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6989c4ffd5-z7vdb" event={"ID":"299ba238-fcb8-4f4b-94ea-73ac08404680","Type":"ContainerStarted","Data":"082273a5a93a4a6f0e1ae2fabb55d800add66d73d2ef6cae3cc9b6d23c2c07f9"} Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261156 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261216 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hczdx\" (UniqueName: \"kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261271 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261345 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261378 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261498 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.261539 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data\") pod \"402b6de2-fa43-4e17-abe7-3af33d08694a\" (UID: \"402b6de2-fa43-4e17-abe7-3af33d08694a\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.262152 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs" (OuterVolumeSpecName: "logs") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.262476 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.275705 4658 scope.go:117] "RemoveContainer" containerID="ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.277456 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx" (OuterVolumeSpecName: "kube-api-access-hczdx") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "kube-api-access-hczdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.282513 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611\": container with ID starting with ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611 not found: ID does not exist" containerID="ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.282597 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611"} err="failed to get container status \"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611\": rpc error: code = NotFound desc = could not find container \"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611\": container with ID starting with ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611 not found: ID does not exist" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.282628 4658 scope.go:117] "RemoveContainer" containerID="1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.282622 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts" (OuterVolumeSpecName: "scripts") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.287934 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4\": container with ID starting with 1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4 not found: ID does not exist" containerID="1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.287979 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4"} err="failed to get container status \"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4\": rpc error: code = NotFound desc = could not find container \"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4\": container with ID starting with 1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4 not found: ID does not exist" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.288008 4658 scope.go:117] "RemoveContainer" containerID="ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.291613 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611"} err="failed to get container status \"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611\": rpc error: code = NotFound desc = could not find container \"ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611\": container with ID starting with ed8ebddb007dfb26125077dae3e5f20985a7a421ef19ca29efcc6830f6d57611 not found: ID does not exist" Oct 02 11:37:17 
crc kubenswrapper[4658]: I1002 11:37:17.291663 4658 scope.go:117] "RemoveContainer" containerID="1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.294697 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4"} err="failed to get container status \"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4\": rpc error: code = NotFound desc = could not find container \"1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4\": container with ID starting with 1eba5984f034caaf12690440cdfaaaf044a37f288512ebcba3da87e118c9b5d4 not found: ID does not exist" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.297485 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.361410 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364326 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364365 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hczdx\" (UniqueName: \"kubernetes.io/projected/402b6de2-fa43-4e17-abe7-3af33d08694a-kube-api-access-hczdx\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364381 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364394 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364406 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.364418 4658 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/402b6de2-fa43-4e17-abe7-3af33d08694a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.401640 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.433077 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data" (OuterVolumeSpecName: "config-data") pod "402b6de2-fa43-4e17-abe7-3af33d08694a" (UID: "402b6de2-fa43-4e17-abe7-3af33d08694a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.466216 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.466257 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402b6de2-fa43-4e17-abe7-3af33d08694a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.475891 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.568838 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.568925 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569000 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569041 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569063 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569174 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klh74\" (UniqueName: \"kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569199 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run\") pod \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\" (UID: \"a2345975-ffff-42aa-b6eb-0e33a17ba4a2\") " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.569929 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.571165 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs" (OuterVolumeSpecName: "logs") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.586516 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.604448 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts" (OuterVolumeSpecName: "scripts") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.609563 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.632551 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74" (OuterVolumeSpecName: "kube-api-access-klh74") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "kube-api-access-klh74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.636904 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.651880 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.675759 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.676414 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.676516 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.677113 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.677618 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.678010 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.678306 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: E1002 11:37:17.678419 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.678496 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.678783 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.678870 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.678953 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-log" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.679019 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" containerName="glance-httpd" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.680281 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.690237 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.690896 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.693255 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.693282 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.702078 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.702114 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klh74\" (UniqueName: \"kubernetes.io/projected/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-kube-api-access-klh74\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.702129 4658 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.702159 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.731549 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data" (OuterVolumeSpecName: "config-data") pod "a2345975-ffff-42aa-b6eb-0e33a17ba4a2" (UID: "a2345975-ffff-42aa-b6eb-0e33a17ba4a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.742133 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.765561 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.804601 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.805666 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.805877 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806041 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806103 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806132 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806523 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc72m\" (UniqueName: \"kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806884 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs\") pod 
\"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806959 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.806976 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2345975-ffff-42aa-b6eb-0e33a17ba4a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.866456 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.909125 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc72m\" (UniqueName: \"kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.909215 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.910827 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.909267 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.910978 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.911008 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.911052 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.911090 
4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.911116 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.911554 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.915286 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.921328 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.922012 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.928335 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.934625 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.937606 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc72m\" (UniqueName: \"kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.950307 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " pod="openstack/glance-default-external-api-0" Oct 02 11:37:17 crc kubenswrapper[4658]: I1002 11:37:17.975544 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402b6de2-fa43-4e17-abe7-3af33d08694a" path="/var/lib/kubelet/pods/402b6de2-fa43-4e17-abe7-3af33d08694a/volumes" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012174 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012233 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012334 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012441 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012472 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.012554 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkj4z\" (UniqueName: \"kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z\") pod \"916133b3-3541-40ec-b32a-4b8bf4870d7f\" (UID: \"916133b3-3541-40ec-b32a-4b8bf4870d7f\") " Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.024420 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.024694 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z" (OuterVolumeSpecName: "kube-api-access-rkj4z") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "kube-api-access-rkj4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.024690 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts" (OuterVolumeSpecName: "scripts") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.024828 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.051328 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.071547 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data" (OuterVolumeSpecName: "config-data") pod "916133b3-3541-40ec-b32a-4b8bf4870d7f" (UID: "916133b3-3541-40ec-b32a-4b8bf4870d7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.114669 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.114919 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.115026 4658 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.115116 4658 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.115200 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916133b3-3541-40ec-b32a-4b8bf4870d7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.115281 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkj4z\" (UniqueName: \"kubernetes.io/projected/916133b3-3541-40ec-b32a-4b8bf4870d7f-kube-api-access-rkj4z\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.163246 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.222977 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-574d544bd8-7g449" event={"ID":"c77ff071-5d94-49df-a4b3-25c8dd727b6e","Type":"ContainerStarted","Data":"8caf4a9289f222232546ed8c3e94723fd22888b41deac692d3d59fe014a54a83"} Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.223018 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-574d544bd8-7g449" event={"ID":"c77ff071-5d94-49df-a4b3-25c8dd727b6e","Type":"ContainerStarted","Data":"36e104995d4afa38a0be8aad1fdf8c8d73584f6500e9c7346f98026acfdcd914"} Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.223150 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.227076 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dc6wn" event={"ID":"916133b3-3541-40ec-b32a-4b8bf4870d7f","Type":"ContainerDied","Data":"abd1b4308b9ad8a460c57cac96df5f8a26895ad9aa2b07f5b86e73a0deac6315"} Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.227099 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd1b4308b9ad8a460c57cac96df5f8a26895ad9aa2b07f5b86e73a0deac6315" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.227142 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dc6wn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.249678 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2345975-ffff-42aa-b6eb-0e33a17ba4a2","Type":"ContainerDied","Data":"639d66b71f521eb477d1349074e8e586a1476c06cb75a531078a148bb8b41375"} Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.249737 4658 scope.go:117] "RemoveContainer" containerID="31f18f240862b19b43cd0e2d5744d027c1f987b73a461d2472a9db58364ade72" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.250643 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.262530 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-574d544bd8-7g449" podStartSLOduration=3.26251215 podStartE2EDuration="3.26251215s" podCreationTimestamp="2025-10-02 11:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:18.248762459 +0000 UTC m=+1119.139916026" watchObservedRunningTime="2025-10-02 11:37:18.26251215 +0000 UTC m=+1119.153665717" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.298190 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.307205 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.326172 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:18 crc kubenswrapper[4658]: E1002 11:37:18.326627 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916133b3-3541-40ec-b32a-4b8bf4870d7f" containerName="keystone-bootstrap" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.326643 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="916133b3-3541-40ec-b32a-4b8bf4870d7f" containerName="keystone-bootstrap" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.326874 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="916133b3-3541-40ec-b32a-4b8bf4870d7f" containerName="keystone-bootstrap" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.327862 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.330055 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.334287 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.334425 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.375887 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.375950 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424588 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424630 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424718 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424757 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2rg\" (UniqueName: \"kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424883 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.424934 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.425016 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.425036 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.438371 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.441635 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.456150 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7db8df9d95-jgkgn"] Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.465256 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.466619 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.468617 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.472180 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db8df9d95-jgkgn"] Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.487689 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.487798 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.487799 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.487856 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.487944 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bltpc" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.512049 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.512099 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526554 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526612 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-internal-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526632 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9hk\" (UniqueName: \"kubernetes.io/projected/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-kube-api-access-6c9hk\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526665 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526721 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-fernet-keys\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526741 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526758 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526773 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-scripts\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526798 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-public-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526816 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526837 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526882 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-combined-ca-bundle\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526899 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526921 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2rg\" (UniqueName: \"kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526950 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-config-data\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.526987 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-credential-keys\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.528673 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.539442 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.540137 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.549357 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.553797 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.554964 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.565458 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.572670 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.574264 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.574415 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2rg\" (UniqueName: \"kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg\") pod \"glance-default-internal-api-0\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629057 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-internal-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629101 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9hk\" (UniqueName: \"kubernetes.io/projected/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-kube-api-access-6c9hk\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629152 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-fernet-keys\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629179 4658 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-scripts\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629201 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-public-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629317 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-combined-ca-bundle\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629920 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-config-data\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.629979 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-credential-keys\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.634998 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-config-data\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.636901 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-combined-ca-bundle\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.638252 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-scripts\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.644776 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-internal-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.644964 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-credential-keys\") pod 
\"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.645324 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-fernet-keys\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.649064 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9hk\" (UniqueName: \"kubernetes.io/projected/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-kube-api-access-6c9hk\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.655122 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e57e6b14-51e6-4efb-ba74-8e57b5e3aa72-public-tls-certs\") pod \"keystone-7db8df9d95-jgkgn\" (UID: \"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72\") " pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.662901 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:18 crc kubenswrapper[4658]: I1002 11:37:18.812133 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.264558 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.270754 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.313711 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.315917 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.543826 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.543937 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.549822 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.596935 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.596981 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.604020 4658 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-776f4bfd7b-cm7vj" podUID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.614435 4658 scope.go:117] "RemoveContainer" containerID="8b9e6470937f047bcaf0df56a3ef9ffe882bc2fcf9ef43edeeb8eab50573ceb1" Oct 02 11:37:19 crc kubenswrapper[4658]: I1002 11:37:19.980393 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2345975-ffff-42aa-b6eb-0e33a17ba4a2" path="/var/lib/kubelet/pods/a2345975-ffff-42aa-b6eb-0e33a17ba4a2/volumes" Oct 02 11:37:20 crc kubenswrapper[4658]: I1002 11:37:20.309677 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d5ppn" event={"ID":"057d8045-79f8-4f4d-9b29-ce1f517e0f94","Type":"ContainerStarted","Data":"0e571d20008829a47fd4be592c00b67333af9598c5d53ca600d02c8ff788d8e4"} Oct 02 11:37:20 crc kubenswrapper[4658]: I1002 11:37:20.323136 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db8df9d95-jgkgn"] Oct 02 11:37:20 crc kubenswrapper[4658]: I1002 11:37:20.332568 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-d5ppn" podStartSLOduration=3.171648484 podStartE2EDuration="51.33254926s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="2025-10-02 11:36:31.49028363 +0000 UTC m=+1072.381437187" lastFinishedPulling="2025-10-02 11:37:19.651184396 +0000 UTC m=+1120.542337963" observedRunningTime="2025-10-02 11:37:20.3300629 +0000 UTC m=+1121.221216477" watchObservedRunningTime="2025-10-02 11:37:20.33254926 +0000 UTC m=+1121.223702827" Oct 02 11:37:20 crc kubenswrapper[4658]: I1002 11:37:20.575718 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:37:20 crc kubenswrapper[4658]: I1002 11:37:20.726598 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:37:21 crc kubenswrapper[4658]: I1002 11:37:21.365273 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerStarted","Data":"98de9a6095fd432b0fd27feac437a484bc2a1a1da6f0f448799a3236211c0082"} Oct 02 11:37:21 crc kubenswrapper[4658]: I1002 11:37:21.411899 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerStarted","Data":"090de77937cd45f43ea44834270934c1abf42921e83bbcac941b3e246beb5d1c"} Oct 02 11:37:21 crc kubenswrapper[4658]: I1002 11:37:21.459538 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db8df9d95-jgkgn" event={"ID":"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72","Type":"ContainerStarted","Data":"71fb945605e411a995d15fe61577f27a332fed4766c56a7e917b8e99d52e5122"} Oct 02 11:37:21 crc kubenswrapper[4658]: I1002 11:37:21.459585 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db8df9d95-jgkgn" event={"ID":"e57e6b14-51e6-4efb-ba74-8e57b5e3aa72","Type":"ContainerStarted","Data":"a5f4d8021a71ae4e8958ea9063e3e4c833f9745896feeaf2013b63e39da50654"} Oct 02 11:37:21 crc kubenswrapper[4658]: I1002 11:37:21.460713 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 
11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.552592 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerStarted","Data":"fd3c33c27c7fdf827fbdc2f57eeac55ed71e2e8399a0b6def13bbd35922bfcd3"} Oct 02 11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.564252 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerStarted","Data":"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2"} Oct 02 11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.761159 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7db8df9d95-jgkgn" podStartSLOduration=4.761133099 podStartE2EDuration="4.761133099s" podCreationTimestamp="2025-10-02 11:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:21.493028678 +0000 UTC m=+1122.384182245" watchObservedRunningTime="2025-10-02 11:37:22.761133099 +0000 UTC m=+1123.652286686" Oct 02 11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.775581 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.775849 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" containerID="cri-o://6ada99d9e53070da995cc306284cd32368a0608d42250281379a37775f7a3e2a" gracePeriod=30 Oct 02 11:37:22 crc kubenswrapper[4658]: I1002 11:37:22.776016 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" containerID="cri-o://209d424c58b2dd299d676c2aa4377aed2a2319c6703c5804596f67401b167ef5" gracePeriod=30 Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.594388 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerStarted","Data":"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0"} Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.600214 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerStarted","Data":"37ddc0308a6193f1af522a7a86165fb4b03469c586273799f8f7fb5e8aa5a151"} Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.603124 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6w77" event={"ID":"6378c687-5c50-4efd-8cc5-b7aa4ef82297","Type":"ContainerStarted","Data":"a51e06eefc7e7d1905e64b87fda225406a015cb1b488d1028ba7c11302952bad"} Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.612415 4658 generic.go:334] "Generic (PLEG): container finished" podID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerID="6ada99d9e53070da995cc306284cd32368a0608d42250281379a37775f7a3e2a" exitCode=143 Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.612494 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerDied","Data":"6ada99d9e53070da995cc306284cd32368a0608d42250281379a37775f7a3e2a"} Oct 02 11:37:23 crc 
kubenswrapper[4658]: I1002 11:37:23.634102 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.63408038 podStartE2EDuration="6.63408038s" podCreationTimestamp="2025-10-02 11:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:23.631368824 +0000 UTC m=+1124.522522401" watchObservedRunningTime="2025-10-02 11:37:23.63408038 +0000 UTC m=+1124.525233947" Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.656828 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.656805687 podStartE2EDuration="5.656805687s" podCreationTimestamp="2025-10-02 11:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:23.652324623 +0000 UTC m=+1124.543478190" watchObservedRunningTime="2025-10-02 11:37:23.656805687 +0000 UTC m=+1124.547959264" Oct 02 11:37:23 crc kubenswrapper[4658]: I1002 11:37:23.698616 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s6w77" podStartSLOduration=4.101630641 podStartE2EDuration="54.698588564s" podCreationTimestamp="2025-10-02 11:36:29 +0000 UTC" firstStartedPulling="2025-10-02 11:36:31.112248719 +0000 UTC m=+1072.003402286" lastFinishedPulling="2025-10-02 11:37:21.709206652 +0000 UTC m=+1122.600360209" observedRunningTime="2025-10-02 11:37:23.675859866 +0000 UTC m=+1124.567013433" watchObservedRunningTime="2025-10-02 11:37:23.698588564 +0000 UTC m=+1124.589742141" Oct 02 11:37:24 crc kubenswrapper[4658]: I1002 11:37:24.232458 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:24 crc kubenswrapper[4658]: I1002 11:37:24.294225 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:37:24 crc kubenswrapper[4658]: I1002 11:37:24.294526 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="dnsmasq-dns" containerID="cri-o://1f75b96569cb9daf5e807e6a5ae8e9e362f122f5c378832e24dbe4139b2b82a2" gracePeriod=10 Oct 02 11:37:24 crc kubenswrapper[4658]: I1002 11:37:24.628045 4658 generic.go:334] "Generic (PLEG): container finished" podID="f667e839-3159-487f-af95-60818fdc1b84" containerID="1f75b96569cb9daf5e807e6a5ae8e9e362f122f5c378832e24dbe4139b2b82a2" exitCode=0 Oct 02 11:37:24 crc kubenswrapper[4658]: I1002 11:37:24.628236 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerDied","Data":"1f75b96569cb9daf5e807e6a5ae8e9e362f122f5c378832e24dbe4139b2b82a2"} Oct 02 11:37:25 crc kubenswrapper[4658]: I1002 11:37:25.380897 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Oct 02 11:37:25 crc kubenswrapper[4658]: I1002 11:37:25.653263 4658 generic.go:334] "Generic (PLEG): container finished" podID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" 
containerID="0e571d20008829a47fd4be592c00b67333af9598c5d53ca600d02c8ff788d8e4" exitCode=0 Oct 02 11:37:25 crc kubenswrapper[4658]: I1002 11:37:25.653322 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d5ppn" event={"ID":"057d8045-79f8-4f4d-9b29-ce1f517e0f94","Type":"ContainerDied","Data":"0e571d20008829a47fd4be592c00b67333af9598c5d53ca600d02c8ff788d8e4"} Oct 02 11:37:26 crc kubenswrapper[4658]: I1002 11:37:26.280264 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": read tcp 10.217.0.2:51802->10.217.0.159:9322: read: connection reset by peer" Oct 02 11:37:26 crc kubenswrapper[4658]: I1002 11:37:26.280649 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": read tcp 10.217.0.2:51816->10.217.0.159:9322: read: connection reset by peer" Oct 02 11:37:26 crc kubenswrapper[4658]: I1002 11:37:26.688556 4658 generic.go:334] "Generic (PLEG): container finished" podID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerID="209d424c58b2dd299d676c2aa4377aed2a2319c6703c5804596f67401b167ef5" exitCode=0 Oct 02 11:37:26 crc kubenswrapper[4658]: I1002 11:37:26.688973 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerDied","Data":"209d424c58b2dd299d676c2aa4377aed2a2319c6703c5804596f67401b167ef5"} Oct 02 11:37:27 crc kubenswrapper[4658]: I1002 11:37:27.429444 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:37:27 crc kubenswrapper[4658]: I1002 11:37:27.429515 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.165339 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.165744 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.209546 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.227305 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.436231 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": dial tcp 10.217.0.159:9322: connect: connection refused" Oct 02 11:37:28 crc 
kubenswrapper[4658]: I1002 11:37:28.436330 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": dial tcp 10.217.0.159:9322: connect: connection refused" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.557891 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.559673 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655162 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522tx\" (UniqueName: \"kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx\") pod \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655226 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655270 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655317 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655518 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle\") pod \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655546 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655569 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7dfv\" (UniqueName: \"kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655585 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc\") pod \"f667e839-3159-487f-af95-60818fdc1b84\" (UID: \"f667e839-3159-487f-af95-60818fdc1b84\") " Oct 
02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.655613 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data\") pod \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\" (UID: \"057d8045-79f8-4f4d-9b29-ce1f517e0f94\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.664184 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.664222 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.664362 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx" (OuterVolumeSpecName: "kube-api-access-522tx") pod "057d8045-79f8-4f4d-9b29-ce1f517e0f94" (UID: "057d8045-79f8-4f4d-9b29-ce1f517e0f94"). InnerVolumeSpecName "kube-api-access-522tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.688589 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "057d8045-79f8-4f4d-9b29-ce1f517e0f94" (UID: "057d8045-79f8-4f4d-9b29-ce1f517e0f94"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.701702 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv" (OuterVolumeSpecName: "kube-api-access-d7dfv") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "kube-api-access-d7dfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.755251 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.757746 4658 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.757811 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522tx\" (UniqueName: \"kubernetes.io/projected/057d8045-79f8-4f4d-9b29-ce1f517e0f94-kube-api-access-522tx\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.757824 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7dfv\" (UniqueName: \"kubernetes.io/projected/f667e839-3159-487f-af95-60818fdc1b84-kube-api-access-d7dfv\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.762426 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.784329 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d5ppn" event={"ID":"057d8045-79f8-4f4d-9b29-ce1f517e0f94","Type":"ContainerDied","Data":"e526562e6c0492349e4a75d7da2b7d1bb6f24b1d65810a000abb43a147cf61f3"} Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.784371 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e526562e6c0492349e4a75d7da2b7d1bb6f24b1d65810a000abb43a147cf61f3" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.784448 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d5ppn" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.790795 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792224 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-t2lfk" event={"ID":"f667e839-3159-487f-af95-60818fdc1b84","Type":"ContainerDied","Data":"32607097163d44e1a6a5791b9a057abeb63dff4d2bf61a4850759596934a5f03"} Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792303 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792324 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792337 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792352 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.792369 4658 scope.go:117] "RemoveContainer" containerID="1f75b96569cb9daf5e807e6a5ae8e9e362f122f5c378832e24dbe4139b2b82a2" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.845584 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.848894 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config" (OuterVolumeSpecName: "config") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.850108 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.858622 4658 scope.go:117] "RemoveContainer" containerID="5795a2ce92c833a010aaac209d7283d07ce29363ddab019311b5845fda1ba01a" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.861811 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.861843 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.877427 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "057d8045-79f8-4f4d-9b29-ce1f517e0f94" (UID: "057d8045-79f8-4f4d-9b29-ce1f517e0f94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.882785 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.897730 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.957857 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f667e839-3159-487f-af95-60818fdc1b84" (UID: "f667e839-3159-487f-af95-60818fdc1b84"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.962946 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca\") pod \"74fa8060-b33d-406a-aaa0-386d23c8532b\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.963001 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftk6v\" (UniqueName: \"kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v\") pod \"74fa8060-b33d-406a-aaa0-386d23c8532b\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.963039 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs\") pod \"74fa8060-b33d-406a-aaa0-386d23c8532b\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.963218 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data\") pod \"74fa8060-b33d-406a-aaa0-386d23c8532b\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.963253 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle\") pod \"74fa8060-b33d-406a-aaa0-386d23c8532b\" (UID: \"74fa8060-b33d-406a-aaa0-386d23c8532b\") " Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.964034 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057d8045-79f8-4f4d-9b29-ce1f517e0f94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.964051 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.964064 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.964075 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667e839-3159-487f-af95-60818fdc1b84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.965482 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs" (OuterVolumeSpecName: "logs") pod "74fa8060-b33d-406a-aaa0-386d23c8532b" (UID: "74fa8060-b33d-406a-aaa0-386d23c8532b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:28 crc kubenswrapper[4658]: I1002 11:37:28.990632 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v" (OuterVolumeSpecName: "kube-api-access-ftk6v") pod "74fa8060-b33d-406a-aaa0-386d23c8532b" (UID: "74fa8060-b33d-406a-aaa0-386d23c8532b"). InnerVolumeSpecName "kube-api-access-ftk6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.035550 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "74fa8060-b33d-406a-aaa0-386d23c8532b" (UID: "74fa8060-b33d-406a-aaa0-386d23c8532b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.076485 4658 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.076536 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftk6v\" (UniqueName: \"kubernetes.io/projected/74fa8060-b33d-406a-aaa0-386d23c8532b-kube-api-access-ftk6v\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.076553 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74fa8060-b33d-406a-aaa0-386d23c8532b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.082693 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data" (OuterVolumeSpecName: "config-data") pod "74fa8060-b33d-406a-aaa0-386d23c8532b" (UID: "74fa8060-b33d-406a-aaa0-386d23c8532b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.090757 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74fa8060-b33d-406a-aaa0-386d23c8532b" (UID: "74fa8060-b33d-406a-aaa0-386d23c8532b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.150218 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.159818 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-t2lfk"] Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.178218 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.180356 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74fa8060-b33d-406a-aaa0-386d23c8532b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.544200 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.597320 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-776f4bfd7b-cm7vj" podUID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.801784 4658 generic.go:334] "Generic (PLEG): container finished" podID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" containerID="a51e06eefc7e7d1905e64b87fda225406a015cb1b488d1028ba7c11302952bad" exitCode=0 Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.801831 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6w77" event={"ID":"6378c687-5c50-4efd-8cc5-b7aa4ef82297","Type":"ContainerDied","Data":"a51e06eefc7e7d1905e64b87fda225406a015cb1b488d1028ba7c11302952bad"} Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.803906 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"74fa8060-b33d-406a-aaa0-386d23c8532b","Type":"ContainerDied","Data":"55b7481862505aa398b26b4af18a702ced1cda0d0467d470ac7d7ae05dbd32f5"} Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.803920 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.803939 4658 scope.go:117] "RemoveContainer" containerID="209d424c58b2dd299d676c2aa4377aed2a2319c6703c5804596f67401b167ef5" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.808881 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerStarted","Data":"addab0ee3e29b566f6f1e77866cdc5a9366156f8316815205c33f7ae44eff9c5"} Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.831814 4658 scope.go:117] "RemoveContainer" containerID="6ada99d9e53070da995cc306284cd32368a0608d42250281379a37775f7a3e2a" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.853866 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.896433 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.921719 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-698b689fd7-9wp8g"] Oct 02 11:37:29 crc kubenswrapper[4658]: E1002 11:37:29.922150 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922165 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" Oct 02 11:37:29 crc kubenswrapper[4658]: E1002 11:37:29.922174 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="init" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922181 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="init" Oct 02 11:37:29 crc kubenswrapper[4658]: E1002 11:37:29.922206 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="dnsmasq-dns" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922214 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="dnsmasq-dns" Oct 02 11:37:29 crc kubenswrapper[4658]: E1002 11:37:29.922226 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922233 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" Oct 02 11:37:29 crc kubenswrapper[4658]: E1002 11:37:29.922250 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" containerName="barbican-db-sync" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922258 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" containerName="barbican-db-sync" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922469 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" containerName="barbican-db-sync" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922478 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f667e839-3159-487f-af95-60818fdc1b84" containerName="dnsmasq-dns" Oct 02 11:37:29 crc 
kubenswrapper[4658]: I1002 11:37:29.922491 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.922509 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" containerName="watcher-api-log" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.923481 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.945447 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.945589 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:37:29 crc kubenswrapper[4658]: I1002 11:37:29.945684 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mwbdb" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.117457 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74fa8060-b33d-406a-aaa0-386d23c8532b" path="/var/lib/kubelet/pods/74fa8060-b33d-406a-aaa0-386d23c8532b/volumes" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.118117 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f667e839-3159-487f-af95-60818fdc1b84" path="/var/lib/kubelet/pods/f667e839-3159-487f-af95-60818fdc1b84/volumes" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.118729 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.141232 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.141366 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.196706 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.201791 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.201865 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.212254 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5fc61f1-3fdf-430c-890e-4e220859285b-logs\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.212342 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data-custom\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.212376 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-combined-ca-bundle\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.212400 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.212421 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkzv\" (UniqueName: \"kubernetes.io/projected/e5fc61f1-3fdf-430c-890e-4e220859285b-kube-api-access-vnkzv\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.235760 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-54ff5bbf66-pmxfv"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.237684 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.259516 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.271953 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-698b689fd7-9wp8g"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.303531 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54ff5bbf66-pmxfv"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.317599 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5fc61f1-3fdf-430c-890e-4e220859285b-logs\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.317675 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data-custom\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.317720 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.317844 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9f1355-f34e-479c-8030-c2848860beb6-logs\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.317953 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-combined-ca-bundle\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318034 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56ll\" (UniqueName: \"kubernetes.io/projected/a963ca85-eeb4-4678-849f-b5b980b36091-kube-api-access-j56ll\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318070 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a963ca85-eeb4-4678-849f-b5b980b36091-logs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318110 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-config-data\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318135 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5fc61f1-3fdf-430c-890e-4e220859285b-logs\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318153 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data-custom\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318256 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdd9n\" (UniqueName: \"kubernetes.io/projected/ed9f1355-f34e-479c-8030-c2848860beb6-kube-api-access-kdd9n\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318290 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-combined-ca-bundle\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318356 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318384 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318416 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkzv\" (UniqueName: \"kubernetes.io/projected/e5fc61f1-3fdf-430c-890e-4e220859285b-kube-api-access-vnkzv\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318470 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc 
kubenswrapper[4658]: I1002 11:37:30.318528 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.318637 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.333461 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.335287 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.339575 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-combined-ca-bundle\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.344990 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data-custom\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.361218 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.371890 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5fc61f1-3fdf-430c-890e-4e220859285b-config-data\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.372008 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkzv\" (UniqueName: \"kubernetes.io/projected/e5fc61f1-3fdf-430c-890e-4e220859285b-kube-api-access-vnkzv\") pod \"barbican-worker-698b689fd7-9wp8g\" (UID: \"e5fc61f1-3fdf-430c-890e-4e220859285b\") " pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.400083 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.401888 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.404455 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420450 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420495 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420518 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420555 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420589 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420606 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420636 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420674 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data-custom\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420692 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420708 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9f1355-f34e-479c-8030-c2848860beb6-logs\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420732 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f29c8\" (UniqueName: \"kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420753 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-combined-ca-bundle\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420787 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56ll\" (UniqueName: \"kubernetes.io/projected/a963ca85-eeb4-4678-849f-b5b980b36091-kube-api-access-j56ll\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420801 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a963ca85-eeb4-4678-849f-b5b980b36091-logs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420824 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-config-data\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420847 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420876 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdd9n\" (UniqueName: \"kubernetes.io/projected/ed9f1355-f34e-479c-8030-c2848860beb6-kube-api-access-kdd9n\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.420907 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.430916 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9f1355-f34e-479c-8030-c2848860beb6-logs\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.437428 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.437694 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a963ca85-eeb4-4678-849f-b5b980b36091-logs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.440807 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.442404 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-config-data\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.443884 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-public-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.444833 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.447197 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.454849 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-combined-ca-bundle\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.458182 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a963ca85-eeb4-4678-849f-b5b980b36091-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.464582 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56ll\" (UniqueName: \"kubernetes.io/projected/a963ca85-eeb4-4678-849f-b5b980b36091-kube-api-access-j56ll\") pod \"watcher-api-0\" (UID: \"a963ca85-eeb4-4678-849f-b5b980b36091\") " pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.466598 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9f1355-f34e-479c-8030-c2848860beb6-config-data-custom\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.471443 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdd9n\" (UniqueName: \"kubernetes.io/projected/ed9f1355-f34e-479c-8030-c2848860beb6-kube-api-access-kdd9n\") pod \"barbican-keystone-listener-54ff5bbf66-pmxfv\" (UID: \"ed9f1355-f34e-479c-8030-c2848860beb6\") " pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522362 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdt2\" (UniqueName: \"kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522455 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522497 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f29c8\" (UniqueName: \"kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522569 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522596 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522759 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522806 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522838 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522901 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522934 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.522963 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.524228 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.525110 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.525446 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.525447 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.526099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.542191 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f29c8\" (UniqueName: \"kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8\") pod \"dnsmasq-dns-75c8ddd69c-8qrj6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.565929 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.607570 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.624464 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.624519 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdt2\" (UniqueName: \"kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.624553 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.624605 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.624658 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.628628 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom\") pod 
\"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.628727 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.628779 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.637443 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-698b689fd7-9wp8g" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.644098 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.649126 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdt2\" (UniqueName: \"kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2\") pod \"barbican-api-6b994d9586-748rf\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.665629 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.682836 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.889822 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.890109 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.891158 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:30 crc kubenswrapper[4658]: I1002 11:37:30.891180 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.191246 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.503417 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s6w77" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.515525 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54ff5bbf66-pmxfv"] Oct 02 11:37:31 crc kubenswrapper[4658]: W1002 11:37:31.527960 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded9f1355_f34e_479c_8030_c2848860beb6.slice/crio-35d02ed1860b45c58330d2e9de4e0530fc8388e3f8061a95c8f61cf6dc297dd7 WatchSource:0}: Error finding container 35d02ed1860b45c58330d2e9de4e0530fc8388e3f8061a95c8f61cf6dc297dd7: Status 404 returned error can't find the container with id 35d02ed1860b45c58330d2e9de4e0530fc8388e3f8061a95c8f61cf6dc297dd7 Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677016 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677463 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677502 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf8r5\" (UniqueName: \"kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677542 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677571 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.677619 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle\") pod \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\" (UID: \"6378c687-5c50-4efd-8cc5-b7aa4ef82297\") " Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.679953 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.693363 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5" (OuterVolumeSpecName: "kube-api-access-lf8r5") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). InnerVolumeSpecName "kube-api-access-lf8r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.705517 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts" (OuterVolumeSpecName: "scripts") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.724048 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.762398 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.784199 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.784230 4658 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.784239 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.784247 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf8r5\" (UniqueName: \"kubernetes.io/projected/6378c687-5c50-4efd-8cc5-b7aa4ef82297-kube-api-access-lf8r5\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.784256 4658 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6378c687-5c50-4efd-8cc5-b7aa4ef82297-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.816350 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data" (OuterVolumeSpecName: "config-data") pod "6378c687-5c50-4efd-8cc5-b7aa4ef82297" (UID: "6378c687-5c50-4efd-8cc5-b7aa4ef82297"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.886414 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6378c687-5c50-4efd-8cc5-b7aa4ef82297-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.921706 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" event={"ID":"ed9f1355-f34e-479c-8030-c2848860beb6","Type":"ContainerStarted","Data":"35d02ed1860b45c58330d2e9de4e0530fc8388e3f8061a95c8f61cf6dc297dd7"} Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.941856 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6w77" event={"ID":"6378c687-5c50-4efd-8cc5-b7aa4ef82297","Type":"ContainerDied","Data":"75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0"} Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.941907 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75366299e74195ce02f15b2b31a8e505d22cd8715729074a9d788d1634beb1d0" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.941914 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s6w77" Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.946808 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a963ca85-eeb4-4678-849f-b5b980b36091","Type":"ContainerStarted","Data":"5c0d84ee042ee6b5bb8a3a0b95af58efe304c4639d58dddf9d5388889088ba0e"} Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.946854 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a963ca85-eeb4-4678-849f-b5b980b36091","Type":"ContainerStarted","Data":"91c51c7999e9e6909b82981ee78a21c7419ef8f137876797b7f5cbb9220629c0"} Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.977111 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:31 crc kubenswrapper[4658]: I1002 11:37:31.988341 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.027058 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-698b689fd7-9wp8g"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.168406 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:32 crc kubenswrapper[4658]: E1002 11:37:32.169510 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" containerName="cinder-db-sync" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.169542 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" containerName="cinder-db-sync" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.170118 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" containerName="cinder-db-sync" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.209108 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.229104 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.234794 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.235607 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.235892 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.236868 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2d764" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.287373 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314662 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314720 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314839 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxjn\" (UniqueName: \"kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314882 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314918 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.314957 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.320564 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:37:32 crc 
kubenswrapper[4658]: I1002 11:37:32.322608 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.341731 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.388554 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.395121 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.401024 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.406287 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.426739 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.427042 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.428269 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.428429 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.427195 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.428695 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.428844 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: 
\"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.429093 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.429609 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.430258 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.430517 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5bn\" (UniqueName: \"kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.430796 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.431287 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxjn\" (UniqueName: \"kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.445570 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.445896 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.446901 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.449430 
4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.479729 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxjn\" (UniqueName: \"kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn\") pod \"cinder-scheduler-0\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.541945 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.541995 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542023 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5bn\" (UniqueName: \"kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542041 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542070 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542091 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48kc6\" (UniqueName: \"kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542104 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542129 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542191 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542212 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542240 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542271 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.542315 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.543067 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.543865 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.548380 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.551928 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " 
pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.555762 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.574319 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.599555 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5bn\" (UniqueName: \"kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn\") pod \"dnsmasq-dns-5784cf869f-kw9hk\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644223 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644339 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644364 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644382 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644416 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48kc6\" (UniqueName: \"kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644434 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644457 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644786 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.644885 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.652754 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.655515 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.678851 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.685918 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.688207 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.693172 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48kc6\" (UniqueName: \"kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6\") pod \"cinder-api-0\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " pod="openstack/cinder-api-0" Oct 02 11:37:32 crc kubenswrapper[4658]: I1002 11:37:32.884525 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.008051 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-698b689fd7-9wp8g" event={"ID":"e5fc61f1-3fdf-430c-890e-4e220859285b","Type":"ContainerStarted","Data":"33295f2b31488741dc16818b2417ba552746fc4efc9276dfc6dcfe2e702eb2ce"} Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.023013 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" event={"ID":"591af15c-9d7a-4cc7-81c3-28a531a328e6","Type":"ContainerStarted","Data":"7b2eba7d41a5db59990d28447d30ce375dc525fb17a3b307e2892b657282e208"} Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.072258 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a963ca85-eeb4-4678-849f-b5b980b36091","Type":"ContainerStarted","Data":"9140adbfebef6fef51a52b12fc13d2a13ff0fc97a0383202106d8cb53263830f"} Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.073170 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.083436 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerStarted","Data":"68b0858514747e0d4500cf25af1cfbf261b92bd552375fafbecb4cfcd700f6af"} Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.083493 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerStarted","Data":"2be84dd4e6e1a0de443a1d48da3b43c542672b9b966a41d2d8116f97baa8a2c5"} Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.101221 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.101200738 podStartE2EDuration="4.101200738s" podCreationTimestamp="2025-10-02 11:37:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:33.099660099 +0000 UTC m=+1133.990813686" watchObservedRunningTime="2025-10-02 11:37:33.101200738 +0000 UTC m=+1133.992354315" Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.298766 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.519666 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:37:33 crc kubenswrapper[4658]: I1002 11:37:33.538803 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.104647 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerStarted","Data":"e5738d51e2f2cf1d3497b15eec2de5c1599a8b6b8308c6dce80a9c9ef1b30331"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.107643 4658 generic.go:334] "Generic (PLEG): container finished" podID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerID="4755c097e2730a1e59f62856b8223dd41a73976014ca7b625ffb481dbf72a05e" exitCode=0 Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.108115 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" 
event={"ID":"ff74bfb7-1171-47ce-acb3-df2b35d0ca20","Type":"ContainerDied","Data":"4755c097e2730a1e59f62856b8223dd41a73976014ca7b625ffb481dbf72a05e"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.108149 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" event={"ID":"ff74bfb7-1171-47ce-acb3-df2b35d0ca20","Type":"ContainerStarted","Data":"5023168be63d57ee4afc23a614d4bc6d54ca9f19ccf0cc1d153982351d00694b"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.118544 4658 generic.go:334] "Generic (PLEG): container finished" podID="591af15c-9d7a-4cc7-81c3-28a531a328e6" containerID="0a704bdb0fe836a2dc89e2b82aa73406db10ebc07387e420cf7eb82552161473" exitCode=0 Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.118792 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" event={"ID":"591af15c-9d7a-4cc7-81c3-28a531a328e6","Type":"ContainerDied","Data":"0a704bdb0fe836a2dc89e2b82aa73406db10ebc07387e420cf7eb82552161473"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.132900 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerStarted","Data":"564592b1e8dfa07ab07844b9184cf713b55f93e26bc4041812e4b1c9a6fb5b8b"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.133443 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.133598 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.148955 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerStarted","Data":"1b8b6f0da6a98eacb9d62e860a4df1c10eb10910b16db7538f930e583c9ca46d"} Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.204852 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b994d9586-748rf" podStartSLOduration=4.204826628 podStartE2EDuration="4.204826628s" podCreationTimestamp="2025-10-02 11:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:34.196662097 +0000 UTC m=+1135.087815684" watchObservedRunningTime="2025-10-02 11:37:34.204826628 +0000 UTC m=+1135.095980205" Oct 02 11:37:34 crc kubenswrapper[4658]: I1002 11:37:34.850359 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.027953 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.028088 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.028126 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.028215 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.028246 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f29c8\" (UniqueName: \"kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.028270 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config\") pod \"591af15c-9d7a-4cc7-81c3-28a531a328e6\" (UID: \"591af15c-9d7a-4cc7-81c3-28a531a328e6\") " Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.048983 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8" (OuterVolumeSpecName: "kube-api-access-f29c8") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "kube-api-access-f29c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.097947 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.130287 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f29c8\" (UniqueName: \"kubernetes.io/projected/591af15c-9d7a-4cc7-81c3-28a531a328e6-kube-api-access-f29c8\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.130328 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.130952 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.131510 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config" (OuterVolumeSpecName: "config") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.132128 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.134723 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "591af15c-9d7a-4cc7-81c3-28a531a328e6" (UID: "591af15c-9d7a-4cc7-81c3-28a531a328e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.172527 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" event={"ID":"ff74bfb7-1171-47ce-acb3-df2b35d0ca20","Type":"ContainerStarted","Data":"95cb18122631cb565b73c262d44f302c384ed175ad61263ef04eb1ab2006875c"} Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.173523 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.188830 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" event={"ID":"591af15c-9d7a-4cc7-81c3-28a531a328e6","Type":"ContainerDied","Data":"7b2eba7d41a5db59990d28447d30ce375dc525fb17a3b307e2892b657282e208"} Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.188881 4658 scope.go:117] "RemoveContainer" containerID="0a704bdb0fe836a2dc89e2b82aa73406db10ebc07387e420cf7eb82552161473" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.189004 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-8qrj6" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.197037 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" podStartSLOduration=3.197020344 podStartE2EDuration="3.197020344s" podCreationTimestamp="2025-10-02 11:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:35.195791744 +0000 UTC m=+1136.086945321" watchObservedRunningTime="2025-10-02 11:37:35.197020344 +0000 UTC m=+1136.088173921" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.201465 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerStarted","Data":"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633"} Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.235075 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.235110 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.235122 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.235133 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/591af15c-9d7a-4cc7-81c3-28a531a328e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.251110 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.251228 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.259761 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.279691 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.289305 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.289398 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.298177 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-8qrj6"] Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.561123 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.575191 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 
11:37:35.575323 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:37:35 crc kubenswrapper[4658]: I1002 11:37:35.969630 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591af15c-9d7a-4cc7-81c3-28a531a328e6" path="/var/lib/kubelet/pods/591af15c-9d7a-4cc7-81c3-28a531a328e6/volumes" Oct 02 11:37:36 crc kubenswrapper[4658]: I1002 11:37:36.892193 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.313603 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cbc95469d-r9kbr"] Oct 02 11:37:37 crc kubenswrapper[4658]: E1002 11:37:37.314364 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591af15c-9d7a-4cc7-81c3-28a531a328e6" containerName="init" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.314381 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="591af15c-9d7a-4cc7-81c3-28a531a328e6" containerName="init" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.314621 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="591af15c-9d7a-4cc7-81c3-28a531a328e6" containerName="init" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.315932 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.323822 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.324091 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.362364 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbc95469d-r9kbr"] Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.383570 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-internal-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.383972 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfmw\" (UniqueName: \"kubernetes.io/projected/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-kube-api-access-hpfmw\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.384005 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-logs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.384050 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 
11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.384088 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data-custom\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.384158 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-public-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.384197 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-combined-ca-bundle\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.487667 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-combined-ca-bundle\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.487827 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-internal-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.487934 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfmw\" (UniqueName: \"kubernetes.io/projected/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-kube-api-access-hpfmw\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.487963 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-logs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.488029 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.488082 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data-custom\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " 
pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.488186 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-public-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.489409 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-logs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.498235 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data-custom\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.500718 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-public-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.502508 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-combined-ca-bundle\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.504421 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-config-data\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.511956 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-internal-tls-certs\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.524982 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfmw\" (UniqueName: \"kubernetes.io/projected/456bb611-ccbc-4d1b-94bf-2ceb7d8345e3-kube-api-access-hpfmw\") pod \"barbican-api-7cbc95469d-r9kbr\" (UID: \"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3\") " pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:37 crc kubenswrapper[4658]: I1002 11:37:37.665729 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:38 crc kubenswrapper[4658]: I1002 11:37:38.113584 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a963ca85-eeb4-4678-849f-b5b980b36091" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.173:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:37:38 crc kubenswrapper[4658]: I1002 11:37:38.247740 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.315807 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.544376 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.544466 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.545247 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8"} pod="openstack/horizon-6dbf7b8b8b-kj6xr" containerMessage="Container horizon failed startup probe, will be restarted" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.545285 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" containerID="cri-o://6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8" gracePeriod=30 Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.597191 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-776f4bfd7b-cm7vj" podUID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.597320 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.598494 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"b902a68948536244db8695a6e4dd9a6e647d1be696ee4baa78124a8553dfffab"} pod="openstack/horizon-776f4bfd7b-cm7vj" containerMessage="Container horizon failed startup probe, will be restarted" Oct 02 11:37:39 crc kubenswrapper[4658]: I1002 11:37:39.598559 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-776f4bfd7b-cm7vj" podUID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerName="horizon" containerID="cri-o://b902a68948536244db8695a6e4dd9a6e647d1be696ee4baa78124a8553dfffab" gracePeriod=30 Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.286780 4658 generic.go:334] "Generic (PLEG): container finished" podID="822259c6-fea2-44cb-9a09-d6415a92e71e" 
containerID="8600e21cfa45a23fc9cec0c62d4292779da66d2c8bb391a471760b22b24a0407" exitCode=137 Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.286825 4658 generic.go:334] "Generic (PLEG): container finished" podID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerID="92c61fede454e4d930a8b2a7c7439bd70f8ca71bee1ff88d0510a17803277073" exitCode=137 Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.286840 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerDied","Data":"8600e21cfa45a23fc9cec0c62d4292779da66d2c8bb391a471760b22b24a0407"} Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.286888 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerDied","Data":"92c61fede454e4d930a8b2a7c7439bd70f8ca71bee1ff88d0510a17803277073"} Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.566690 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 02 11:37:40 crc kubenswrapper[4658]: I1002 11:37:40.589342 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 02 11:37:41 crc kubenswrapper[4658]: I1002 11:37:41.308419 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.346692 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.352831 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.356223 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6b994d9586-748rf" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.682498 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.748353 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:42 crc kubenswrapper[4658]: I1002 11:37:42.748591 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="dnsmasq-dns" containerID="cri-o://8b8978037794019d6beb72573549347d8a8c005dcddd7b968afa1939cd31e5b2" gracePeriod=10 Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.330161 4658 generic.go:334] "Generic (PLEG): container finished" podID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerID="8b8978037794019d6beb72573549347d8a8c005dcddd7b968afa1939cd31e5b2" exitCode=0 Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.330722 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" event={"ID":"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e","Type":"ContainerDied","Data":"8b8978037794019d6beb72573549347d8a8c005dcddd7b968afa1939cd31e5b2"} Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.749396 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-6989c4ffd5-z7vdb" Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.824479 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.824734 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-565ccbd57b-kt62s" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-api" containerID="cri-o://a87b6456f4a87208b841bf7062661b0b0d8c6155e0207147a6e0c052f784ccd0" gracePeriod=30 Oct 02 11:37:43 crc kubenswrapper[4658]: I1002 11:37:43.825427 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-565ccbd57b-kt62s" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-httpd" containerID="cri-o://631d442b4a9a3c355372f7c5b9ab59e21341e6a46715ca6efb040d0d65ee0f67" gracePeriod=30 Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.348374 4658 generic.go:334] "Generic (PLEG): container finished" podID="218ca390-9242-4dba-8899-0852cbc26bea" containerID="631d442b4a9a3c355372f7c5b9ab59e21341e6a46715ca6efb040d0d65ee0f67" exitCode=0 Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.348395 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerDied","Data":"631d442b4a9a3c355372f7c5b9ab59e21341e6a46715ca6efb040d0d65ee0f67"} Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.717444 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.727492 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.869934 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.869978 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870002 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs\") pod \"822259c6-fea2-44cb-9a09-d6415a92e71e\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870022 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gczjd\" (UniqueName: \"kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870055 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870103 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870131 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df5rt\" (UniqueName: \"kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt\") pod \"822259c6-fea2-44cb-9a09-d6415a92e71e\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870446 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb\") pod \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\" (UID: \"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870782 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key\") pod \"822259c6-fea2-44cb-9a09-d6415a92e71e\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870790 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs" (OuterVolumeSpecName: "logs") pod "822259c6-fea2-44cb-9a09-d6415a92e71e" (UID: 
"822259c6-fea2-44cb-9a09-d6415a92e71e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870817 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data\") pod \"822259c6-fea2-44cb-9a09-d6415a92e71e\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.870847 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts\") pod \"822259c6-fea2-44cb-9a09-d6415a92e71e\" (UID: \"822259c6-fea2-44cb-9a09-d6415a92e71e\") " Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.871333 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822259c6-fea2-44cb-9a09-d6415a92e71e-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.874490 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd" (OuterVolumeSpecName: "kube-api-access-gczjd") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "kube-api-access-gczjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.875418 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt" (OuterVolumeSpecName: "kube-api-access-df5rt") pod "822259c6-fea2-44cb-9a09-d6415a92e71e" (UID: "822259c6-fea2-44cb-9a09-d6415a92e71e"). InnerVolumeSpecName "kube-api-access-df5rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.876246 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "822259c6-fea2-44cb-9a09-d6415a92e71e" (UID: "822259c6-fea2-44cb-9a09-d6415a92e71e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.896685 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts" (OuterVolumeSpecName: "scripts") pod "822259c6-fea2-44cb-9a09-d6415a92e71e" (UID: "822259c6-fea2-44cb-9a09-d6415a92e71e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.913136 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data" (OuterVolumeSpecName: "config-data") pod "822259c6-fea2-44cb-9a09-d6415a92e71e" (UID: "822259c6-fea2-44cb-9a09-d6415a92e71e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.926957 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.928804 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.935977 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.944898 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config" (OuterVolumeSpecName: "config") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.945402 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" (UID: "4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972810 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972844 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972858 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gczjd\" (UniqueName: \"kubernetes.io/projected/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-kube-api-access-gczjd\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972892 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972903 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972913 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df5rt\" (UniqueName: \"kubernetes.io/projected/822259c6-fea2-44cb-9a09-d6415a92e71e-kube-api-access-df5rt\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972924 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972934 4658 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/822259c6-fea2-44cb-9a09-d6415a92e71e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972945 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:44 crc kubenswrapper[4658]: I1002 11:37:44.972978 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822259c6-fea2-44cb-9a09-d6415a92e71e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.362321 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78b685455c-5zn4s" event={"ID":"822259c6-fea2-44cb-9a09-d6415a92e71e","Type":"ContainerDied","Data":"385e17e93764f1535f16a15115d98f8494f9e38c1c0acf4eca4d8efe47dd3631"} Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.362642 4658 scope.go:117] "RemoveContainer" containerID="8600e21cfa45a23fc9cec0c62d4292779da66d2c8bb391a471760b22b24a0407" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.362711 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78b685455c-5zn4s" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.373940 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" event={"ID":"4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e","Type":"ContainerDied","Data":"cb59ec3dd88be11738ebcb1bd075df6e35f544de9cbbcf8b7dbb929e1dd29f15"} Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.374016 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.473924 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78b685455c-5zn4s"] Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.495803 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-78b685455c-5zn4s"] Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.506768 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.521943 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-dt2rk"] Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.601009 4658 scope.go:117] "RemoveContainer" containerID="92c61fede454e4d930a8b2a7c7439bd70f8ca71bee1ff88d0510a17803277073" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.683010 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbc95469d-r9kbr"] Oct 02 11:37:45 crc kubenswrapper[4658]: W1002 11:37:45.722711 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456bb611_ccbc_4d1b_94bf_2ceb7d8345e3.slice/crio-605a2069cb462aec6955abbf1f7f3617bf9213dd43766029a5a53ca7e97a6f13 WatchSource:0}: Error finding container 605a2069cb462aec6955abbf1f7f3617bf9213dd43766029a5a53ca7e97a6f13: Status 404 returned error can't find the container with id 605a2069cb462aec6955abbf1f7f3617bf9213dd43766029a5a53ca7e97a6f13 Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.869233 4658 scope.go:117] "RemoveContainer" containerID="8b8978037794019d6beb72573549347d8a8c005dcddd7b968afa1939cd31e5b2" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.902681 4658 scope.go:117] "RemoveContainer" containerID="577c07b1fada8fcf0ae5fda670b69dd3dd76b12419431abed82afd10b6ea93bf" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.965676 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" path="/var/lib/kubelet/pods/4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e/volumes" Oct 02 11:37:45 crc kubenswrapper[4658]: I1002 11:37:45.966610 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" path="/var/lib/kubelet/pods/822259c6-fea2-44cb-9a09-d6415a92e71e/volumes" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.403170 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-698b689fd7-9wp8g" event={"ID":"e5fc61f1-3fdf-430c-890e-4e220859285b","Type":"ContainerStarted","Data":"27323ea431e06476586200d3674a18ba31423be49e868eb2481726ca1521bdaa"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.407803 4658 generic.go:334] "Generic (PLEG): container finished" podID="218ca390-9242-4dba-8899-0852cbc26bea" containerID="a87b6456f4a87208b841bf7062661b0b0d8c6155e0207147a6e0c052f784ccd0" exitCode=0 Oct 02 11:37:46 crc 
kubenswrapper[4658]: I1002 11:37:46.407877 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerDied","Data":"a87b6456f4a87208b841bf7062661b0b0d8c6155e0207147a6e0c052f784ccd0"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.479859 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerStarted","Data":"6170eda60cb1ce56430233a3ec6e7771883331fe0bdce6340a2d8d1a31a7d08a"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.480051 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-central-agent" containerID="cri-o://216113a97fe44bd440b15a58eef28be0f657c4645f1843967f36267fbdae5183" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.480126 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.480492 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="proxy-httpd" containerID="cri-o://6170eda60cb1ce56430233a3ec6e7771883331fe0bdce6340a2d8d1a31a7d08a" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.480539 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="sg-core" containerID="cri-o://addab0ee3e29b566f6f1e77866cdc5a9366156f8316815205c33f7ae44eff9c5" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.480563 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-notification-agent" containerID="cri-o://630370b49e9083126045ee666e011d798ddc1cb0fc91e00dc2c8d769d93a324d" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.492006 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerStarted","Data":"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.509873 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.343068203 podStartE2EDuration="1m16.509854987s" podCreationTimestamp="2025-10-02 11:36:30 +0000 UTC" firstStartedPulling="2025-10-02 11:36:32.195007812 +0000 UTC m=+1073.086161389" lastFinishedPulling="2025-10-02 11:37:45.361794606 +0000 UTC m=+1146.252948173" observedRunningTime="2025-10-02 11:37:46.508347288 +0000 UTC m=+1147.399500855" watchObservedRunningTime="2025-10-02 11:37:46.509854987 +0000 UTC m=+1147.401008554" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.512851 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api-log" containerID="cri-o://6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.512960 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerStarted","Data":"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.513015 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api" containerID="cri-o://a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" gracePeriod=30 Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.513389 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.523270 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.524768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" event={"ID":"ed9f1355-f34e-479c-8030-c2848860beb6","Type":"ContainerStarted","Data":"a2f697b9d02d808fb4a6fcf95f51bb1f1cc513534d164d340512e29c04ed4082"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.524806 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" event={"ID":"ed9f1355-f34e-479c-8030-c2848860beb6","Type":"ContainerStarted","Data":"127e4a863704d2b7e0d2390828bdddc2dc8143377e69676281200153c689023f"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.536115 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc95469d-r9kbr" event={"ID":"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3","Type":"ContainerStarted","Data":"12b7e5ef684ea634f6ab84915e569caa25026373e756f6f559c2c8905e1acd90"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.536168 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc95469d-r9kbr" event={"ID":"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3","Type":"ContainerStarted","Data":"605a2069cb462aec6955abbf1f7f3617bf9213dd43766029a5a53ca7e97a6f13"} Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.536411 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.537078 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.560210 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=14.560194177 podStartE2EDuration="14.560194177s" podCreationTimestamp="2025-10-02 11:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:46.555268488 +0000 UTC m=+1147.446422055" watchObservedRunningTime="2025-10-02 11:37:46.560194177 +0000 UTC m=+1147.451347744" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.618397 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs\") pod \"218ca390-9242-4dba-8899-0852cbc26bea\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.618466 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config\") pod \"218ca390-9242-4dba-8899-0852cbc26bea\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.618503 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config\") pod \"218ca390-9242-4dba-8899-0852cbc26bea\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.618630 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle\") pod \"218ca390-9242-4dba-8899-0852cbc26bea\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.618705 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttrvz\" (UniqueName: \"kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz\") pod \"218ca390-9242-4dba-8899-0852cbc26bea\" (UID: \"218ca390-9242-4dba-8899-0852cbc26bea\") " Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.642841 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz" (OuterVolumeSpecName: "kube-api-access-ttrvz") pod "218ca390-9242-4dba-8899-0852cbc26bea" (UID: "218ca390-9242-4dba-8899-0852cbc26bea"). InnerVolumeSpecName "kube-api-access-ttrvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.652080 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "218ca390-9242-4dba-8899-0852cbc26bea" (UID: "218ca390-9242-4dba-8899-0852cbc26bea"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.689756 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cbc95469d-r9kbr" podStartSLOduration=9.689715929 podStartE2EDuration="9.689715929s" podCreationTimestamp="2025-10-02 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:46.618011706 +0000 UTC m=+1147.509165273" watchObservedRunningTime="2025-10-02 11:37:46.689715929 +0000 UTC m=+1147.580869496" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.714938 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-54ff5bbf66-pmxfv" podStartSLOduration=4.676775038 podStartE2EDuration="17.714916785s" podCreationTimestamp="2025-10-02 11:37:29 +0000 UTC" firstStartedPulling="2025-10-02 11:37:31.542443231 +0000 UTC m=+1132.433596798" lastFinishedPulling="2025-10-02 11:37:44.580584968 +0000 UTC m=+1145.471738545" observedRunningTime="2025-10-02 11:37:46.669910665 +0000 UTC m=+1147.561064232" watchObservedRunningTime="2025-10-02 11:37:46.714916785 +0000 UTC m=+1147.606070352" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.721181 4658 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.721210 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttrvz\" (UniqueName: \"kubernetes.io/projected/218ca390-9242-4dba-8899-0852cbc26bea-kube-api-access-ttrvz\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.766405 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config" (OuterVolumeSpecName: "config") pod "218ca390-9242-4dba-8899-0852cbc26bea" (UID: "218ca390-9242-4dba-8899-0852cbc26bea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.826254 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.851574 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "218ca390-9242-4dba-8899-0852cbc26bea" (UID: "218ca390-9242-4dba-8899-0852cbc26bea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.872406 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "218ca390-9242-4dba-8899-0852cbc26bea" (UID: "218ca390-9242-4dba-8899-0852cbc26bea"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.928659 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:46 crc kubenswrapper[4658]: I1002 11:37:46.929523 4658 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218ca390-9242-4dba-8899-0852cbc26bea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.335582 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440162 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440283 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440427 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440461 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48kc6\" (UniqueName: \"kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440504 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440606 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.440642 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id\") pod \"b9e4f67c-d66b-41ba-9bec-920e299e7110\" (UID: \"b9e4f67c-d66b-41ba-9bec-920e299e7110\") " Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.441287 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: 
"b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.441825 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs" (OuterVolumeSpecName: "logs") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.451529 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6" (OuterVolumeSpecName: "kube-api-access-48kc6") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "kube-api-access-48kc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.451665 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.459674 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts" (OuterVolumeSpecName: "scripts") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.480251 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.506102 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data" (OuterVolumeSpecName: "config-data") pod "b9e4f67c-d66b-41ba-9bec-920e299e7110" (UID: "b9e4f67c-d66b-41ba-9bec-920e299e7110"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543693 4658 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543742 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e4f67c-d66b-41ba-9bec-920e299e7110-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543755 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48kc6\" (UniqueName: \"kubernetes.io/projected/b9e4f67c-d66b-41ba-9bec-920e299e7110-kube-api-access-48kc6\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543768 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543780 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543791 4658 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e4f67c-d66b-41ba-9bec-920e299e7110-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.543800 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e4f67c-d66b-41ba-9bec-920e299e7110-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.579873 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-698b689fd7-9wp8g" event={"ID":"e5fc61f1-3fdf-430c-890e-4e220859285b","Type":"ContainerStarted","Data":"6c86d867b70104934a18e2a78816db431c983bccc971262985ed3d81a812eb24"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.581889 4658 generic.go:334] "Generic (PLEG): container finished" podID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerID="a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" exitCode=0 Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.581919 4658 generic.go:334] "Generic (PLEG): container finished" podID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerID="6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" exitCode=143 Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.582012 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.582054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerDied","Data":"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.582125 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerDied","Data":"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.582143 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e4f67c-d66b-41ba-9bec-920e299e7110","Type":"ContainerDied","Data":"e5738d51e2f2cf1d3497b15eec2de5c1599a8b6b8308c6dce80a9c9ef1b30331"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.582167 4658 scope.go:117] "RemoveContainer" containerID="a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.600568 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc95469d-r9kbr" event={"ID":"456bb611-ccbc-4d1b-94bf-2ceb7d8345e3","Type":"ContainerStarted","Data":"aa40338807061769d82cd75a68e9c9b8ddae820039530ad776f54e576d958b5b"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.604048 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-565ccbd57b-kt62s" event={"ID":"218ca390-9242-4dba-8899-0852cbc26bea","Type":"ContainerDied","Data":"d78dc895ea1f5f03b58b8b06d6046e0521925ac48eac5bc332224a9e7c9be1ee"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.604131 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-565ccbd57b-kt62s" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.613238 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-698b689fd7-9wp8g" podStartSLOduration=5.388033168 podStartE2EDuration="18.613224218s" podCreationTimestamp="2025-10-02 11:37:29 +0000 UTC" firstStartedPulling="2025-10-02 11:37:32.064267542 +0000 UTC m=+1132.955421109" lastFinishedPulling="2025-10-02 11:37:45.289458592 +0000 UTC m=+1146.180612159" observedRunningTime="2025-10-02 11:37:47.611639717 +0000 UTC m=+1148.502793284" watchObservedRunningTime="2025-10-02 11:37:47.613224218 +0000 UTC m=+1148.504377785" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.623819 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerID="6170eda60cb1ce56430233a3ec6e7771883331fe0bdce6340a2d8d1a31a7d08a" exitCode=0 Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.624156 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerID="addab0ee3e29b566f6f1e77866cdc5a9366156f8316815205c33f7ae44eff9c5" exitCode=2 Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.624170 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerID="216113a97fe44bd440b15a58eef28be0f657c4645f1843967f36267fbdae5183" exitCode=0 Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.624245 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerDied","Data":"6170eda60cb1ce56430233a3ec6e7771883331fe0bdce6340a2d8d1a31a7d08a"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.624280 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerDied","Data":"addab0ee3e29b566f6f1e77866cdc5a9366156f8316815205c33f7ae44eff9c5"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.624314 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerDied","Data":"216113a97fe44bd440b15a58eef28be0f657c4645f1843967f36267fbdae5183"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.630358 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerStarted","Data":"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9"} Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.674408 4658 scope.go:117] "RemoveContainer" containerID="6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.682101 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.427536601 podStartE2EDuration="15.68208258s" podCreationTimestamp="2025-10-02 11:37:32 +0000 UTC" firstStartedPulling="2025-10-02 11:37:33.326031799 +0000 UTC m=+1134.217185366" lastFinishedPulling="2025-10-02 11:37:44.580577778 +0000 UTC m=+1145.471731345" observedRunningTime="2025-10-02 11:37:47.65454924 +0000 UTC m=+1148.545702817" watchObservedRunningTime="2025-10-02 11:37:47.68208258 +0000 UTC m=+1148.573236137" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.689883 4658 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.716391 4658 scope.go:117] "RemoveContainer" containerID="a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.722381 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.723206 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b\": container with ID starting with a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b not found: ID does not exist" containerID="a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.723249 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b"} err="failed to get container status \"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b\": rpc error: code = NotFound desc = could not find container \"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b\": container with ID starting with a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b not found: ID does not exist" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.723319 4658 scope.go:117] "RemoveContainer" containerID="6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.724395 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633\": container with ID starting with 6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633 not found: ID does not exist" containerID="6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.724443 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633"} err="failed to get container status \"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633\": rpc error: code = NotFound desc = could not find container \"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633\": container with ID starting with 6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633 not found: ID does not exist" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.724469 4658 scope.go:117] "RemoveContainer" containerID="a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.724960 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b"} err="failed to get container status \"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b\": rpc error: code = NotFound desc = could not find container \"a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b\": container with ID starting with a8a1eb56362a2cef5a3093744dd9dc17570de5d21fb9a804c1bec85457e83b6b not found: ID does not exist" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.724999 4658 scope.go:117] "RemoveContainer" 
containerID="6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.725236 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633"} err="failed to get container status \"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633\": rpc error: code = NotFound desc = could not find container \"6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633\": container with ID starting with 6a12da3e8693ca580076e077ea1278f8f4009e055b7b0d08eb18034006a07633 not found: ID does not exist" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.725256 4658 scope.go:117] "RemoveContainer" containerID="631d442b4a9a3c355372f7c5b9ab59e21341e6a46715ca6efb040d0d65ee0f67" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.756194 4658 scope.go:117] "RemoveContainer" containerID="a87b6456f4a87208b841bf7062661b0b0d8c6155e0207147a6e0c052f784ccd0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.768459 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.777082 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-565ccbd57b-kt62s"] Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.784914 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785448 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785477 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785499 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="init" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785546 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="init" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785567 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api-log" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785598 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api-log" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785611 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon-log" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785619 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon-log" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785642 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-httpd" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785652 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-httpd" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785668 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785676 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785687 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-api" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785695 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-api" Oct 02 11:37:47 crc kubenswrapper[4658]: E1002 11:37:47.785711 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="dnsmasq-dns" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785718 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="dnsmasq-dns" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785959 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon-log" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.785992 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="822259c6-fea2-44cb-9a09-d6415a92e71e" containerName="horizon" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.786007 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.786025 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-api" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.786046 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" containerName="cinder-api-log" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.786068 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="dnsmasq-dns" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.786081 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="218ca390-9242-4dba-8899-0852cbc26bea" containerName="neutron-httpd" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.787498 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.819790 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.820157 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.820675 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.827584 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858195 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858389 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzdj\" (UniqueName: \"kubernetes.io/projected/ca5cc232-0768-4541-b654-03a61ffd7ddc-kube-api-access-lmzdj\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858644 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858771 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-scripts\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858813 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.858879 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5cc232-0768-4541-b654-03a61ffd7ddc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.859018 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5cc232-0768-4541-b654-03a61ffd7ddc-logs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.859131 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.859420 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.961925 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzdj\" (UniqueName: \"kubernetes.io/projected/ca5cc232-0768-4541-b654-03a61ffd7ddc-kube-api-access-lmzdj\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962046 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962098 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-scripts\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962157 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962224 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5cc232-0768-4541-b654-03a61ffd7ddc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962249 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5cc232-0768-4541-b654-03a61ffd7ddc-logs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962329 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962488 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.962613 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.965660 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5cc232-0768-4541-b654-03a61ffd7ddc-logs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.965742 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5cc232-0768-4541-b654-03a61ffd7ddc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.968953 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-scripts\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.970206 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.972338 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.972757 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.973458 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.978242 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5cc232-0768-4541-b654-03a61ffd7ddc-config-data\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.986476 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzdj\" (UniqueName: \"kubernetes.io/projected/ca5cc232-0768-4541-b654-03a61ffd7ddc-kube-api-access-lmzdj\") pod \"cinder-api-0\" (UID: \"ca5cc232-0768-4541-b654-03a61ffd7ddc\") " pod="openstack/cinder-api-0" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.986913 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218ca390-9242-4dba-8899-0852cbc26bea" 
path="/var/lib/kubelet/pods/218ca390-9242-4dba-8899-0852cbc26bea/volumes" Oct 02 11:37:47 crc kubenswrapper[4658]: I1002 11:37:47.987904 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e4f67c-d66b-41ba-9bec-920e299e7110" path="/var/lib/kubelet/pods/b9e4f67c-d66b-41ba-9bec-920e299e7110/volumes" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.065344 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.066432 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-574d544bd8-7g449" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.143430 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.630502 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.653095 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerID="630370b49e9083126045ee666e011d798ddc1cb0fc91e00dc2c8d769d93a324d" exitCode=0 Oct 02 11:37:48 crc kubenswrapper[4658]: W1002 11:37:48.653130 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca5cc232_0768_4541_b654_03a61ffd7ddc.slice/crio-03a2e87a43a99ded66a37dfdecf81a0b6cc90be9fa4d74337f7bec3ab0f003fa WatchSource:0}: Error finding container 03a2e87a43a99ded66a37dfdecf81a0b6cc90be9fa4d74337f7bec3ab0f003fa: Status 404 returned error can't find the container with id 03a2e87a43a99ded66a37dfdecf81a0b6cc90be9fa4d74337f7bec3ab0f003fa Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.653130 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerDied","Data":"630370b49e9083126045ee666e011d798ddc1cb0fc91e00dc2c8d769d93a324d"} Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.820674 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884222 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw9cx\" (UniqueName: \"kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884315 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884363 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884425 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884479 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884523 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.884630 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd\") pod \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\" (UID: \"cd5709aa-c4aa-4577-b3cb-e518acf890f1\") " Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.885439 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.885584 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.891480 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts" (OuterVolumeSpecName: "scripts") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.909011 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx" (OuterVolumeSpecName: "kube-api-access-cw9cx") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "kube-api-access-cw9cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.931457 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.987860 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.987891 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw9cx\" (UniqueName: \"kubernetes.io/projected/cd5709aa-c4aa-4577-b3cb-e518acf890f1-kube-api-access-cw9cx\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.987900 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd5709aa-c4aa-4577-b3cb-e518acf890f1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.987909 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:48 crc kubenswrapper[4658]: I1002 11:37:48.987917 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.043951 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.064979 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data" (OuterVolumeSpecName: "config-data") pod "cd5709aa-c4aa-4577-b3cb-e518acf890f1" (UID: "cd5709aa-c4aa-4577-b3cb-e518acf890f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.089571 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.089617 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5709aa-c4aa-4577-b3cb-e518acf890f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.232979 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-84b966f6c9-dt2rk" podUID="4bf3eb4a-fbb7-43ea-a2af-5a92bbc57b0e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: i/o timeout" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.668418 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca5cc232-0768-4541-b654-03a61ffd7ddc","Type":"ContainerStarted","Data":"38217f7268c6957dc51e143c61fc3db17e2cd3da15428bd1b2d35f14eab40514"} Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.668786 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca5cc232-0768-4541-b654-03a61ffd7ddc","Type":"ContainerStarted","Data":"03a2e87a43a99ded66a37dfdecf81a0b6cc90be9fa4d74337f7bec3ab0f003fa"} Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.672030 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd5709aa-c4aa-4577-b3cb-e518acf890f1","Type":"ContainerDied","Data":"ed563ed6debbdbe889ef3cf32b42e9f82aa88c2dae4b5936a87d2ec4f4c0c628"} Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.672087 4658 scope.go:117] "RemoveContainer" containerID="6170eda60cb1ce56430233a3ec6e7771883331fe0bdce6340a2d8d1a31a7d08a" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.672380 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.726130 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.747741 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.753640 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:37:49 crc kubenswrapper[4658]: E1002 11:37:49.754020 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-central-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754034 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-central-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: E1002 11:37:49.754051 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-notification-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754058 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-notification-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: E1002 11:37:49.754070 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="proxy-httpd" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754077 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="proxy-httpd" Oct 02 11:37:49 crc kubenswrapper[4658]: E1002 11:37:49.754098 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="sg-core" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754103 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="sg-core" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754322 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-notification-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754332 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="sg-core" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754346 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="proxy-httpd" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.754358 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" containerName="ceilometer-central-agent" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.761218 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.771240 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.771573 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.781031 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.792181 4658 scope.go:117] "RemoveContainer" containerID="addab0ee3e29b566f6f1e77866cdc5a9366156f8316815205c33f7ae44eff9c5" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800306 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800375 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800402 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800444 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42m4\" (UniqueName: \"kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800518 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800622 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.800657 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905409 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905537 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905581 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905614 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905651 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905676 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.905717 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42m4\" (UniqueName: \"kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.906584 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.907556 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.915459 4658 scope.go:117] "RemoveContainer" containerID="630370b49e9083126045ee666e011d798ddc1cb0fc91e00dc2c8d769d93a324d" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.916230 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 
11:37:49.916254 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.917777 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.922287 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:49 crc kubenswrapper[4658]: I1002 11:37:49.938007 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42m4\" (UniqueName: \"kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4\") pod \"ceilometer-0\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " pod="openstack/ceilometer-0" Oct 02 11:37:50 crc kubenswrapper[4658]: I1002 11:37:50.009560 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5709aa-c4aa-4577-b3cb-e518acf890f1" path="/var/lib/kubelet/pods/cd5709aa-c4aa-4577-b3cb-e518acf890f1/volumes" Oct 02 11:37:50 crc kubenswrapper[4658]: I1002 11:37:50.091468 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:37:50 crc kubenswrapper[4658]: I1002 11:37:50.094519 4658 scope.go:117] "RemoveContainer" containerID="216113a97fe44bd440b15a58eef28be0f657c4645f1843967f36267fbdae5183" Oct 02 11:37:50 crc kubenswrapper[4658]: I1002 11:37:50.693927 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:37:50 crc kubenswrapper[4658]: I1002 11:37:50.998531 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7db8df9d95-jgkgn" Oct 02 11:37:51 crc kubenswrapper[4658]: I1002 11:37:51.701373 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerStarted","Data":"852959d62e49ef499d3c00e9dd5c186978699ff3790325796250c0d73ec328c0"} Oct 02 11:37:51 crc kubenswrapper[4658]: I1002 11:37:51.702857 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerStarted","Data":"a87c22b84f32e87b02364670800850ac243de891f636577ab0b961ac332476e8"} Oct 02 11:37:51 crc kubenswrapper[4658]: I1002 11:37:51.703445 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca5cc232-0768-4541-b654-03a61ffd7ddc","Type":"ContainerStarted","Data":"2e08bb8444510d9d1341669a4665c47f1c30884f4a65f4f244202c6cc0599810"} Oct 02 11:37:51 crc kubenswrapper[4658]: I1002 11:37:51.703728 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:37:51 crc kubenswrapper[4658]: I1002 11:37:51.722470 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.722451382 
podStartE2EDuration="4.722451382s" podCreationTimestamp="2025-10-02 11:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:51.719131156 +0000 UTC m=+1152.610284723" watchObservedRunningTime="2025-10-02 11:37:51.722451382 +0000 UTC m=+1152.613604949" Oct 02 11:37:52 crc kubenswrapper[4658]: I1002 11:37:52.575188 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:37:52 crc kubenswrapper[4658]: I1002 11:37:52.714763 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerStarted","Data":"f328f894b667b9f07305574c062db679980c25137e21e4eda2db2c6d40c738aa"} Oct 02 11:37:52 crc kubenswrapper[4658]: I1002 11:37:52.715649 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerStarted","Data":"b3ad37ad31309db8ad340b99cb49efb6c80cb8c124e588849eeabfb378663d32"} Oct 02 11:37:52 crc kubenswrapper[4658]: I1002 11:37:52.830644 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:37:52 crc kubenswrapper[4658]: I1002 11:37:52.882541 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.334775 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.336663 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.338542 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-prlwg" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.339267 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.340130 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.350057 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.390608 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.390950 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn74d\" (UniqueName: \"kubernetes.io/projected/53d4842f-7f97-4191-bcea-c8076517503f-kube-api-access-jn74d\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.391056 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.391122 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.493119 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.493271 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn74d\" (UniqueName: \"kubernetes.io/projected/53d4842f-7f97-4191-bcea-c8076517503f-kube-api-access-jn74d\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.493374 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config-secret\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.493404 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.494236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.499405 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-openstack-config-secret\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.500163 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4842f-7f97-4191-bcea-c8076517503f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.510898 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn74d\" (UniqueName: \"kubernetes.io/projected/53d4842f-7f97-4191-bcea-c8076517503f-kube-api-access-jn74d\") pod \"openstackclient\" (UID: \"53d4842f-7f97-4191-bcea-c8076517503f\") " pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.656451 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.726903 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="cinder-scheduler" containerID="cri-o://38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e" gracePeriod=30 Oct 02 11:37:53 crc kubenswrapper[4658]: I1002 11:37:53.726964 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="probe" containerID="cri-o://7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9" gracePeriod=30 Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.290458 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.740196 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerStarted","Data":"bd3cd22dc3f41d16382612e731cbb2ee2836e096a7b11f9f92e2834d0e0e7fa0"} Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.741034 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.743839 4658 generic.go:334] "Generic (PLEG): container finished" podID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerID="7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9" exitCode=0 Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.743893 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerDied","Data":"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9"} Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.745522 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"53d4842f-7f97-4191-bcea-c8076517503f","Type":"ContainerStarted","Data":"d9ae3710febdee7bb66a851e4d530ed5ef26b4f24070f4ad4ecdc005955a6853"} Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.775168 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.638370033 podStartE2EDuration="5.775152463s" podCreationTimestamp="2025-10-02 11:37:49 +0000 UTC" firstStartedPulling="2025-10-02 11:37:50.692555021 +0000 UTC m=+1151.583708588" lastFinishedPulling="2025-10-02 11:37:53.829337451 +0000 UTC m=+1154.720491018" observedRunningTime="2025-10-02 11:37:54.771916619 +0000 UTC m=+1155.663070186" watchObservedRunningTime="2025-10-02 11:37:54.775152463 +0000 UTC m=+1155.666306030" Oct 02 11:37:54 crc kubenswrapper[4658]: I1002 11:37:54.802809 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.109971 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbc95469d-r9kbr" Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.210573 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.210870 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b994d9586-748rf" 
podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" containerID="cri-o://68b0858514747e0d4500cf25af1cfbf261b92bd552375fafbecb4cfcd700f6af" gracePeriod=30 Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.211390 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b994d9586-748rf" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api" containerID="cri-o://564592b1e8dfa07ab07844b9184cf713b55f93e26bc4041812e4b1c9a6fb5b8b" gracePeriod=30 Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.761853 4658 generic.go:334] "Generic (PLEG): container finished" podID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerID="68b0858514747e0d4500cf25af1cfbf261b92bd552375fafbecb4cfcd700f6af" exitCode=143 Oct 02 11:37:55 crc kubenswrapper[4658]: I1002 11:37:55.761943 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerDied","Data":"68b0858514747e0d4500cf25af1cfbf261b92bd552375fafbecb4cfcd700f6af"} Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.317981 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.358801 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.358925 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.359039 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.359144 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.359177 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxjn\" (UniqueName: \"kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.359198 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id\") pod \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\" (UID: \"5b027647-05c5-4977-a5b8-498cc9cc5dc1\") " Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.360348 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.367249 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts" (OuterVolumeSpecName: "scripts") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.367254 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn" (OuterVolumeSpecName: "kube-api-access-5wxjn") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "kube-api-access-5wxjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.371456 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.464476 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxjn\" (UniqueName: \"kubernetes.io/projected/5b027647-05c5-4977-a5b8-498cc9cc5dc1-kube-api-access-5wxjn\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.464518 4658 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b027647-05c5-4977-a5b8-498cc9cc5dc1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.464537 4658 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.464549 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.478819 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.502105 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data" (OuterVolumeSpecName: "config-data") pod "5b027647-05c5-4977-a5b8-498cc9cc5dc1" (UID: "5b027647-05c5-4977-a5b8-498cc9cc5dc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.566153 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.566185 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b027647-05c5-4977-a5b8-498cc9cc5dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.787995 4658 generic.go:334] "Generic (PLEG): container finished" podID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerID="38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e" exitCode=0 Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.788054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerDied","Data":"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e"} Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.788065 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.788091 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b027647-05c5-4977-a5b8-498cc9cc5dc1","Type":"ContainerDied","Data":"1b8b6f0da6a98eacb9d62e860a4df1c10eb10910b16db7538f930e583c9ca46d"} Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.788105 4658 scope.go:117] "RemoveContainer" containerID="7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.835886 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.842443 4658 scope.go:117] "RemoveContainer" containerID="38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.871399 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.876198 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:56 crc kubenswrapper[4658]: E1002 11:37:56.879998 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="probe" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.880041 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="probe" Oct 02 11:37:56 crc kubenswrapper[4658]: E1002 11:37:56.880099 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="cinder-scheduler" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.880109 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="cinder-scheduler" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.880342 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="probe" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.880380 4658 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" containerName="cinder-scheduler" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.896804 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.909042 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.919886 4658 scope.go:117] "RemoveContainer" containerID="7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9" Oct 02 11:37:56 crc kubenswrapper[4658]: E1002 11:37:56.924436 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9\": container with ID starting with 7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9 not found: ID does not exist" containerID="7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.924487 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9"} err="failed to get container status \"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9\": rpc error: code = NotFound desc = could not find container \"7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9\": container with ID starting with 7c8d6b1d439a92f842fc33716f27df56ec9e0fa1cbb03b8ca31d8d0dfb8146e9 not found: ID does not exist" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.924515 4658 scope.go:117] "RemoveContainer" containerID="38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e" Oct 02 11:37:56 crc kubenswrapper[4658]: E1002 11:37:56.925080 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e\": container with ID starting with 38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e not found: ID does not exist" containerID="38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.925104 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e"} err="failed to get container status \"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e\": rpc error: code = NotFound desc = could not find container \"38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e\": container with ID starting with 38e464a1ccf2312fe18b2945fa10973dd347854dad33709a73b9eac80d97c68e not found: ID does not exist" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.928247 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975261 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975338 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-scripts\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975435 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975482 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjzmt\" (UniqueName: \"kubernetes.io/projected/efbe9a47-e907-4393-8f6a-9e1a824383f4-kube-api-access-bjzmt\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975531 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efbe9a47-e907-4393-8f6a-9e1a824383f4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:56 crc kubenswrapper[4658]: I1002 11:37:56.975675 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.087933 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.088269 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjzmt\" (UniqueName: \"kubernetes.io/projected/efbe9a47-e907-4393-8f6a-9e1a824383f4-kube-api-access-bjzmt\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.088325 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efbe9a47-e907-4393-8f6a-9e1a824383f4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.088473 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.088574 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.088601 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-scripts\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.089414 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efbe9a47-e907-4393-8f6a-9e1a824383f4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.094197 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.095163 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.096908 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-config-data\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.122875 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efbe9a47-e907-4393-8f6a-9e1a824383f4-scripts\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.139927 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjzmt\" (UniqueName: \"kubernetes.io/projected/efbe9a47-e907-4393-8f6a-9e1a824383f4-kube-api-access-bjzmt\") pod \"cinder-scheduler-0\" (UID: \"efbe9a47-e907-4393-8f6a-9e1a824383f4\") " pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.224821 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.430603 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.431014 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.715770 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.812983 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efbe9a47-e907-4393-8f6a-9e1a824383f4","Type":"ContainerStarted","Data":"7a597b537f5cd3fa7886531a9c68d317ec18d84f7d955debec5b54d678086244"} Oct 02 11:37:57 crc kubenswrapper[4658]: I1002 11:37:57.976209 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b027647-05c5-4977-a5b8-498cc9cc5dc1" path="/var/lib/kubelet/pods/5b027647-05c5-4977-a5b8-498cc9cc5dc1/volumes" Oct 02 11:37:58 crc kubenswrapper[4658]: I1002 11:37:58.493448 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b994d9586-748rf" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": read tcp 10.217.0.2:52570->10.217.0.176:9311: read: connection reset by peer" Oct 02 11:37:58 crc kubenswrapper[4658]: I1002 11:37:58.494335 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b994d9586-748rf" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": read tcp 10.217.0.2:52578->10.217.0.176:9311: read: connection reset by peer" Oct 02 11:37:58 crc kubenswrapper[4658]: I1002 11:37:58.875895 4658 generic.go:334] "Generic (PLEG): container finished" podID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerID="564592b1e8dfa07ab07844b9184cf713b55f93e26bc4041812e4b1c9a6fb5b8b" exitCode=0 Oct 02 11:37:58 crc kubenswrapper[4658]: I1002 11:37:58.875964 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerDied","Data":"564592b1e8dfa07ab07844b9184cf713b55f93e26bc4041812e4b1c9a6fb5b8b"} Oct 02 11:37:58 crc kubenswrapper[4658]: I1002 11:37:58.881074 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efbe9a47-e907-4393-8f6a-9e1a824383f4","Type":"ContainerStarted","Data":"dc2cc741789d96222c21fe860a7a22095c52dcbbdcb4b643ff484a4cbefcbc0b"} Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.066439 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.146418 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data\") pod \"39084258-a9f4-4b1e-9a7c-d0e622c39479\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.146491 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom\") pod \"39084258-a9f4-4b1e-9a7c-d0e622c39479\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.146570 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs\") pod \"39084258-a9f4-4b1e-9a7c-d0e622c39479\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.146597 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle\") pod \"39084258-a9f4-4b1e-9a7c-d0e622c39479\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.146653 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pdt2\" (UniqueName: \"kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2\") pod \"39084258-a9f4-4b1e-9a7c-d0e622c39479\" (UID: \"39084258-a9f4-4b1e-9a7c-d0e622c39479\") " Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.149050 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs" (OuterVolumeSpecName: "logs") pod "39084258-a9f4-4b1e-9a7c-d0e622c39479" (UID: "39084258-a9f4-4b1e-9a7c-d0e622c39479"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.154562 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39084258-a9f4-4b1e-9a7c-d0e622c39479" (UID: "39084258-a9f4-4b1e-9a7c-d0e622c39479"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.155523 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2" (OuterVolumeSpecName: "kube-api-access-5pdt2") pod "39084258-a9f4-4b1e-9a7c-d0e622c39479" (UID: "39084258-a9f4-4b1e-9a7c-d0e622c39479"). InnerVolumeSpecName "kube-api-access-5pdt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.186550 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39084258-a9f4-4b1e-9a7c-d0e622c39479" (UID: "39084258-a9f4-4b1e-9a7c-d0e622c39479"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.233906 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data" (OuterVolumeSpecName: "config-data") pod "39084258-a9f4-4b1e-9a7c-d0e622c39479" (UID: "39084258-a9f4-4b1e-9a7c-d0e622c39479"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.250466 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pdt2\" (UniqueName: \"kubernetes.io/projected/39084258-a9f4-4b1e-9a7c-d0e622c39479-kube-api-access-5pdt2\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.250499 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.250508 4658 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.250516 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39084258-a9f4-4b1e-9a7c-d0e622c39479-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.250523 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39084258-a9f4-4b1e-9a7c-d0e622c39479-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.894145 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b994d9586-748rf" event={"ID":"39084258-a9f4-4b1e-9a7c-d0e622c39479","Type":"ContainerDied","Data":"2be84dd4e6e1a0de443a1d48da3b43c542672b9b966a41d2d8116f97baa8a2c5"} Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.894395 4658 scope.go:117] "RemoveContainer" containerID="564592b1e8dfa07ab07844b9184cf713b55f93e26bc4041812e4b1c9a6fb5b8b" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.894399 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b994d9586-748rf" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.901495 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efbe9a47-e907-4393-8f6a-9e1a824383f4","Type":"ContainerStarted","Data":"f76303ebf449a7022e313bc2cd0797ae306f1662ee35a711f27d9c7e22f4f468"} Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.925333 4658 scope.go:117] "RemoveContainer" containerID="68b0858514747e0d4500cf25af1cfbf261b92bd552375fafbecb4cfcd700f6af" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.926387 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.926367085 podStartE2EDuration="3.926367085s" podCreationTimestamp="2025-10-02 11:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:37:59.923930827 +0000 UTC m=+1160.815084414" watchObservedRunningTime="2025-10-02 11:37:59.926367085 +0000 UTC m=+1160.817520652" Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.945909 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:37:59 crc kubenswrapper[4658]: I1002 11:37:59.966055 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b994d9586-748rf"] Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.331018 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5566488b4c-k88mg"] Oct 02 11:38:01 crc kubenswrapper[4658]: E1002 11:38:01.331831 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.331850 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api" Oct 02 11:38:01 crc kubenswrapper[4658]: E1002 11:38:01.331890 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.331898 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.332129 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.332149 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" containerName="barbican-api-log" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.333498 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.340393 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.341124 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.342128 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.356633 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5566488b4c-k88mg"] Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.395075 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409523 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-config-data\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409582 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-log-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409627 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-combined-ca-bundle\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409704 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-run-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409740 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-etc-swift\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409816 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-internal-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409906 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-public-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.409949 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mvb\" (UniqueName: \"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-kube-api-access-88mvb\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511350 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-internal-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511755 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-public-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511784 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mvb\" (UniqueName: \"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-kube-api-access-88mvb\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511812 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-config-data\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511833 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-log-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511864 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-combined-ca-bundle\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.511950 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-run-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.512005 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-etc-swift\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.513989 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-log-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.514429 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67435e65-47df-41df-9570-df74c35bd5fc-run-httpd\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.517909 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-config-data\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.527150 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-combined-ca-bundle\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.527955 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-public-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.531586 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-etc-swift\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.538412 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mvb\" (UniqueName: \"kubernetes.io/projected/67435e65-47df-41df-9570-df74c35bd5fc-kube-api-access-88mvb\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.540197 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67435e65-47df-41df-9570-df74c35bd5fc-internal-tls-certs\") pod \"swift-proxy-5566488b4c-k88mg\" (UID: \"67435e65-47df-41df-9570-df74c35bd5fc\") " pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.614835 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.615281 4658 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-log" containerID="cri-o://fd3c33c27c7fdf827fbdc2f57eeac55ed71e2e8399a0b6def13bbd35922bfcd3" gracePeriod=30 Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.615379 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-httpd" containerID="cri-o://37ddc0308a6193f1af522a7a86165fb4b03469c586273799f8f7fb5e8aa5a151" gracePeriod=30 Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.659198 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.953631 4658 generic.go:334] "Generic (PLEG): container finished" podID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerID="fd3c33c27c7fdf827fbdc2f57eeac55ed71e2e8399a0b6def13bbd35922bfcd3" exitCode=143 Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.966667 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39084258-a9f4-4b1e-9a7c-d0e622c39479" path="/var/lib/kubelet/pods/39084258-a9f4-4b1e-9a7c-d0e622c39479/volumes" Oct 02 11:38:01 crc kubenswrapper[4658]: I1002 11:38:01.967613 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerDied","Data":"fd3c33c27c7fdf827fbdc2f57eeac55ed71e2e8399a0b6def13bbd35922bfcd3"} Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.229318 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.639437 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.647848 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-central-agent" containerID="cri-o://852959d62e49ef499d3c00e9dd5c186978699ff3790325796250c0d73ec328c0" gracePeriod=30 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.647901 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="sg-core" containerID="cri-o://f328f894b667b9f07305574c062db679980c25137e21e4eda2db2c6d40c738aa" gracePeriod=30 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.647988 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="proxy-httpd" containerID="cri-o://bd3cd22dc3f41d16382612e731cbb2ee2836e096a7b11f9f92e2834d0e0e7fa0" gracePeriod=30 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.647982 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-notification-agent" containerID="cri-o://b3ad37ad31309db8ad340b99cb49efb6c80cb8c124e588849eeabfb378663d32" gracePeriod=30 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.979017 4658 generic.go:334] "Generic (PLEG): container finished" podID="d2760468-fb22-4275-8906-8bc5981ab243" containerID="bd3cd22dc3f41d16382612e731cbb2ee2836e096a7b11f9f92e2834d0e0e7fa0" 
exitCode=0 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.979406 4658 generic.go:334] "Generic (PLEG): container finished" podID="d2760468-fb22-4275-8906-8bc5981ab243" containerID="f328f894b667b9f07305574c062db679980c25137e21e4eda2db2c6d40c738aa" exitCode=2 Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.979172 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerDied","Data":"bd3cd22dc3f41d16382612e731cbb2ee2836e096a7b11f9f92e2834d0e0e7fa0"} Oct 02 11:38:02 crc kubenswrapper[4658]: I1002 11:38:02.979441 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerDied","Data":"f328f894b667b9f07305574c062db679980c25137e21e4eda2db2c6d40c738aa"} Oct 02 11:38:03 crc kubenswrapper[4658]: I1002 11:38:03.992855 4658 generic.go:334] "Generic (PLEG): container finished" podID="d2760468-fb22-4275-8906-8bc5981ab243" containerID="b3ad37ad31309db8ad340b99cb49efb6c80cb8c124e588849eeabfb378663d32" exitCode=0 Oct 02 11:38:03 crc kubenswrapper[4658]: I1002 11:38:03.992901 4658 generic.go:334] "Generic (PLEG): container finished" podID="d2760468-fb22-4275-8906-8bc5981ab243" containerID="852959d62e49ef499d3c00e9dd5c186978699ff3790325796250c0d73ec328c0" exitCode=0 Oct 02 11:38:03 crc kubenswrapper[4658]: I1002 11:38:03.992932 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerDied","Data":"b3ad37ad31309db8ad340b99cb49efb6c80cb8c124e588849eeabfb378663d32"} Oct 02 11:38:03 crc kubenswrapper[4658]: I1002 11:38:03.993002 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerDied","Data":"852959d62e49ef499d3c00e9dd5c186978699ff3790325796250c0d73ec328c0"} Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.342283 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mrrgm"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.346474 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.367181 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mrrgm"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.444767 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-85tpz"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.445951 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.475932 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-85tpz"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.484148 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfxg\" (UniqueName: \"kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg\") pod \"nova-api-db-create-mrrgm\" (UID: \"08cd2749-20f1-4836-ac61-62b7d555a3b3\") " pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.551577 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gt5m2"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.553147 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.562289 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gt5m2"] Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.586251 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfxg\" (UniqueName: \"kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg\") pod \"nova-api-db-create-mrrgm\" (UID: \"08cd2749-20f1-4836-ac61-62b7d555a3b3\") " pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.586400 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhcl\" (UniqueName: \"kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl\") pod \"nova-cell0-db-create-85tpz\" (UID: \"6473d21f-8a15-443f-b5ac-2211e1cf0e55\") " pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.609435 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfxg\" (UniqueName: \"kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg\") pod \"nova-api-db-create-mrrgm\" (UID: \"08cd2749-20f1-4836-ac61-62b7d555a3b3\") " pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.664217 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.687659 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhjv\" (UniqueName: \"kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv\") pod \"nova-cell1-db-create-gt5m2\" (UID: \"5806c84d-2c8f-402d-9487-656bd2936933\") " pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.687713 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llhcl\" (UniqueName: \"kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl\") pod \"nova-cell0-db-create-85tpz\" (UID: \"6473d21f-8a15-443f-b5ac-2211e1cf0e55\") " pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.707548 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhcl\" (UniqueName: \"kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl\") pod \"nova-cell0-db-create-85tpz\" (UID: \"6473d21f-8a15-443f-b5ac-2211e1cf0e55\") " pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.764893 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.790001 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhjv\" (UniqueName: \"kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv\") pod \"nova-cell1-db-create-gt5m2\" (UID: \"5806c84d-2c8f-402d-9487-656bd2936933\") " pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.809276 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhjv\" (UniqueName: \"kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv\") pod \"nova-cell1-db-create-gt5m2\" (UID: \"5806c84d-2c8f-402d-9487-656bd2936933\") " pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:04 crc kubenswrapper[4658]: I1002 11:38:04.882732 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:05 crc kubenswrapper[4658]: I1002 11:38:05.017651 4658 generic.go:334] "Generic (PLEG): container finished" podID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerID="37ddc0308a6193f1af522a7a86165fb4b03469c586273799f8f7fb5e8aa5a151" exitCode=0 Oct 02 11:38:05 crc kubenswrapper[4658]: I1002 11:38:05.017694 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerDied","Data":"37ddc0308a6193f1af522a7a86165fb4b03469c586273799f8f7fb5e8aa5a151"} Oct 02 11:38:06 crc kubenswrapper[4658]: I1002 11:38:06.932739 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:06 crc kubenswrapper[4658]: I1002 11:38:06.933350 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-log" containerID="cri-o://502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2" gracePeriod=30 Oct 02 11:38:06 crc kubenswrapper[4658]: I1002 11:38:06.933444 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-httpd" containerID="cri-o://9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0" gracePeriod=30 Oct 02 11:38:07 crc kubenswrapper[4658]: I1002 11:38:07.461954 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.063916 4658 generic.go:334] "Generic (PLEG): container finished" podID="546e3884-d904-4d23-853e-6855aee00e02" containerID="502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2" exitCode=143 Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.064115 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerDied","Data":"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2"} Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.535999 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.641142 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663060 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663132 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663237 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663255 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663374 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d42m4\" (UniqueName: \"kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663407 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663450 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml\") pod \"d2760468-fb22-4275-8906-8bc5981ab243\" (UID: \"d2760468-fb22-4275-8906-8bc5981ab243\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.663916 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.664027 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.670329 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4" (OuterVolumeSpecName: "kube-api-access-d42m4") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "kube-api-access-d42m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.671050 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts" (OuterVolumeSpecName: "scripts") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.716144 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.764694 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.764741 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2rg\" (UniqueName: \"kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.764799 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.764871 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.764980 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.765049 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.765084 4658 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.765185 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.765180 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts\") pod \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\" (UID: \"a0f090b2-4ffc-4a27-b8ee-52a6912bf436\") " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.765518 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs" (OuterVolumeSpecName: "logs") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766056 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766081 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766095 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d42m4\" (UniqueName: \"kubernetes.io/projected/d2760468-fb22-4275-8906-8bc5981ab243-kube-api-access-d42m4\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766109 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766120 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766131 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2760468-fb22-4275-8906-8bc5981ab243-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.766143 4658 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.770670 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts" (OuterVolumeSpecName: "scripts") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: 
"a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.771133 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.771442 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg" (OuterVolumeSpecName: "kube-api-access-lg2rg") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "kube-api-access-lg2rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.776475 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.815196 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data" (OuterVolumeSpecName: "config-data") pod "d2760468-fb22-4275-8906-8bc5981ab243" (UID: "d2760468-fb22-4275-8906-8bc5981ab243"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.833961 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.844507 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data" (OuterVolumeSpecName: "config-data") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.844678 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0f090b2-4ffc-4a27-b8ee-52a6912bf436" (UID: "a0f090b2-4ffc-4a27-b8ee-52a6912bf436"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867521 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867563 4658 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867576 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867589 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867602 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867613 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg2rg\" (UniqueName: \"kubernetes.io/projected/a0f090b2-4ffc-4a27-b8ee-52a6912bf436-kube-api-access-lg2rg\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867625 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2760468-fb22-4275-8906-8bc5981ab243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.867678 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.895615 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.968008 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mrrgm"] Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.969567 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:08 crc kubenswrapper[4658]: I1002 11:38:08.978232 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-85tpz"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.000040 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gt5m2"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.079615 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5566488b4c-k88mg"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.085287 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gt5m2" 
event={"ID":"5806c84d-2c8f-402d-9487-656bd2936933","Type":"ContainerStarted","Data":"fb90397f9ca26addbd657051ece598417ce1d615e4557392e3305a1970222cfd"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.086553 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mrrgm" event={"ID":"08cd2749-20f1-4836-ac61-62b7d555a3b3","Type":"ContainerStarted","Data":"a64ce5bbddf0a0c017592b26cd642a4b7265b3c0eff0941755fac2c12ba195b3"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.091596 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.091624 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0f090b2-4ffc-4a27-b8ee-52a6912bf436","Type":"ContainerDied","Data":"090de77937cd45f43ea44834270934c1abf42921e83bbcac941b3e246beb5d1c"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.091723 4658 scope.go:117] "RemoveContainer" containerID="37ddc0308a6193f1af522a7a86165fb4b03469c586273799f8f7fb5e8aa5a151" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.107628 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85tpz" event={"ID":"6473d21f-8a15-443f-b5ac-2211e1cf0e55","Type":"ContainerStarted","Data":"f094dac3c8a93e46d4bc268246c0aac1d540a2a702b826b05047aa65728ee118"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.112054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2760468-fb22-4275-8906-8bc5981ab243","Type":"ContainerDied","Data":"a87c22b84f32e87b02364670800850ac243de891f636577ab0b961ac332476e8"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.112181 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.114714 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"53d4842f-7f97-4191-bcea-c8076517503f","Type":"ContainerStarted","Data":"44bab8a3271a804e94dabeb15dfda755081fd96866d6df820dbceb36e70bcc5f"} Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.133080 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.322278768 podStartE2EDuration="16.133064865s" podCreationTimestamp="2025-10-02 11:37:53 +0000 UTC" firstStartedPulling="2025-10-02 11:37:54.28920535 +0000 UTC m=+1155.180358917" lastFinishedPulling="2025-10-02 11:38:08.099991447 +0000 UTC m=+1168.991145014" observedRunningTime="2025-10-02 11:38:09.13232161 +0000 UTC m=+1170.023475197" watchObservedRunningTime="2025-10-02 11:38:09.133064865 +0000 UTC m=+1170.024218432" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.181157 4658 scope.go:117] "RemoveContainer" containerID="fd3c33c27c7fdf827fbdc2f57eeac55ed71e2e8399a0b6def13bbd35922bfcd3" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.230447 4658 scope.go:117] "RemoveContainer" containerID="bd3cd22dc3f41d16382612e731cbb2ee2836e096a7b11f9f92e2834d0e0e7fa0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.243201 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.264360 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.300249 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.316197 4658 scope.go:117] "RemoveContainer" containerID="f328f894b667b9f07305574c062db679980c25137e21e4eda2db2c6d40c738aa" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.326555 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350284 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350771 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-log" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350791 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-log" Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350803 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="sg-core" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350811 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="sg-core" Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350831 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350839 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350853 4658 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-central-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350862 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-central-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350874 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="proxy-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350882 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="proxy-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: E1002 11:38:09.350903 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-notification-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.350911 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-notification-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351163 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-notification-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351182 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="proxy-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351199 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="sg-core" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351208 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-httpd" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351220 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" containerName="glance-log" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.351229 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2760468-fb22-4275-8906-8bc5981ab243" containerName="ceilometer-central-agent" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.352384 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.354764 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.359038 4658 scope.go:117] "RemoveContainer" containerID="b3ad37ad31309db8ad340b99cb49efb6c80cb8c124e588849eeabfb378663d32" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.359824 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.361056 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.365874 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.372527 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.373903 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.374385 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.381661 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.382199 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.382368 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.382889 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.383005 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.383077 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.383164 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcpzv\" (UniqueName: \"kubernetes.io/projected/d6306f11-af13-4078-ad43-b00e333855b1-kube-api-access-pcpzv\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.383244 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.383716 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.400454 4658 scope.go:117] "RemoveContainer" containerID="852959d62e49ef499d3c00e9dd5c186978699ff3790325796250c0d73ec328c0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484637 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484740 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ff8g\" (UniqueName: \"kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484766 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484790 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484818 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484844 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484891 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.484934 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 
crc kubenswrapper[4658]: I1002 11:38:09.484958 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485080 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485106 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485139 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485165 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485190 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485214 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcpzv\" (UniqueName: \"kubernetes.io/projected/d6306f11-af13-4078-ad43-b00e333855b1-kube-api-access-pcpzv\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.485899 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.487835 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.489758 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6306f11-af13-4078-ad43-b00e333855b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.494770 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.499570 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.501837 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.504025 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6306f11-af13-4078-ad43-b00e333855b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.517280 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcpzv\" (UniqueName: \"kubernetes.io/projected/d6306f11-af13-4078-ad43-b00e333855b1-kube-api-access-pcpzv\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.554181 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6306f11-af13-4078-ad43-b00e333855b1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587597 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587698 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ff8g\" (UniqueName: \"kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587719 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd\") pod \"ceilometer-0\" (UID: 
\"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587769 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587792 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587839 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.587883 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.588914 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.588982 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.591889 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.598281 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.598673 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.601799 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 
11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.621896 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ff8g\" (UniqueName: \"kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g\") pod \"ceilometer-0\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.704806 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.716956 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.969227 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f090b2-4ffc-4a27-b8ee-52a6912bf436" path="/var/lib/kubelet/pods/a0f090b2-4ffc-4a27-b8ee-52a6912bf436/volumes" Oct 02 11:38:09 crc kubenswrapper[4658]: I1002 11:38:09.970410 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2760468-fb22-4275-8906-8bc5981ab243" path="/var/lib/kubelet/pods/d2760468-fb22-4275-8906-8bc5981ab243/volumes" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.137242 4658 generic.go:334] "Generic (PLEG): container finished" podID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerID="6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8" exitCode=137 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.137344 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerDied","Data":"6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.137723 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerStarted","Data":"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.145861 4658 generic.go:334] "Generic (PLEG): container finished" podID="5806c84d-2c8f-402d-9487-656bd2936933" containerID="f9b1fba6315acfc7dd04f38b51428968bfb5073789c75961311d897284c21eaa" exitCode=0 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.145911 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gt5m2" event={"ID":"5806c84d-2c8f-402d-9487-656bd2936933","Type":"ContainerDied","Data":"f9b1fba6315acfc7dd04f38b51428968bfb5073789c75961311d897284c21eaa"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.148607 4658 generic.go:334] "Generic (PLEG): container finished" podID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerID="b902a68948536244db8695a6e4dd9a6e647d1be696ee4baa78124a8553dfffab" exitCode=137 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.148657 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776f4bfd7b-cm7vj" event={"ID":"02408c48-14d8-4a7b-8ebf-79fd2fa1b924","Type":"ContainerDied","Data":"b902a68948536244db8695a6e4dd9a6e647d1be696ee4baa78124a8553dfffab"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.148678 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776f4bfd7b-cm7vj" event={"ID":"02408c48-14d8-4a7b-8ebf-79fd2fa1b924","Type":"ContainerStarted","Data":"d48b455b14fa5eee32e056d67359f08e1f1c52a8fbed4cc6bada3c5d8ce05d9f"} Oct 02 11:38:10 crc 
kubenswrapper[4658]: I1002 11:38:10.167736 4658 generic.go:334] "Generic (PLEG): container finished" podID="08cd2749-20f1-4836-ac61-62b7d555a3b3" containerID="c8979b246183ec945438779ca471a0c822952f7e550171d5eafbd4ef9e5fdb26" exitCode=0 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.167816 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mrrgm" event={"ID":"08cd2749-20f1-4836-ac61-62b7d555a3b3","Type":"ContainerDied","Data":"c8979b246183ec945438779ca471a0c822952f7e550171d5eafbd4ef9e5fdb26"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.183102 4658 generic.go:334] "Generic (PLEG): container finished" podID="6473d21f-8a15-443f-b5ac-2211e1cf0e55" containerID="cbe52a82df437a48099dba564b91a3aae02a948123a6b805fa211187fc907aa6" exitCode=0 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.183194 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85tpz" event={"ID":"6473d21f-8a15-443f-b5ac-2211e1cf0e55","Type":"ContainerDied","Data":"cbe52a82df437a48099dba564b91a3aae02a948123a6b805fa211187fc907aa6"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.204829 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5566488b4c-k88mg" event={"ID":"67435e65-47df-41df-9570-df74c35bd5fc","Type":"ContainerStarted","Data":"3a3e9eb74349a9051518734a73615a5ec891919fbc420cfed001c01ade1f6130"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.204861 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5566488b4c-k88mg" event={"ID":"67435e65-47df-41df-9570-df74c35bd5fc","Type":"ContainerStarted","Data":"c8d4e1a6b5ebf83d3e4b7b91de1f1cdba5b5b80be254a6293cabe01032a594e8"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.204872 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5566488b4c-k88mg" event={"ID":"67435e65-47df-41df-9570-df74c35bd5fc","Type":"ContainerStarted","Data":"4a60f890f2154b186e553802eb93e64b3586ee22c9223e659f6fd798f3223e8b"} Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.204885 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.204907 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.257069 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5566488b4c-k88mg" podStartSLOduration=9.257044977 podStartE2EDuration="9.257044977s" podCreationTimestamp="2025-10-02 11:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:38:10.251800016 +0000 UTC m=+1171.142953583" watchObservedRunningTime="2025-10-02 11:38:10.257044977 +0000 UTC m=+1171.148198544" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.289084 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.485535 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:38:10 crc kubenswrapper[4658]: W1002 11:38:10.493723 4658 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6306f11_af13_4078_ad43_b00e333855b1.slice/crio-29386e356b9e11db098aaae8c6ba539b8b7c44c1aa7456cf8b78838702ae0dc8 WatchSource:0}: Error finding container 29386e356b9e11db098aaae8c6ba539b8b7c44c1aa7456cf8b78838702ae0dc8: Status 404 returned error can't find the container with id 29386e356b9e11db098aaae8c6ba539b8b7c44c1aa7456cf8b78838702ae0dc8 Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.706765 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825252 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825343 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc72m\" (UniqueName: \"kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825410 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825502 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825587 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825697 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.825737 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data\") pod \"546e3884-d904-4d23-853e-6855aee00e02\" (UID: \"546e3884-d904-4d23-853e-6855aee00e02\") " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.827171 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.830390 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs" (OuterVolumeSpecName: "logs") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.831601 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m" (OuterVolumeSpecName: "kube-api-access-fc72m") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "kube-api-access-fc72m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.845088 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.860005 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts" (OuterVolumeSpecName: "scripts") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.889587 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.927974 4658 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.928007 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.928028 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.928039 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.928048 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc72m\" (UniqueName: \"kubernetes.io/projected/546e3884-d904-4d23-853e-6855aee00e02-kube-api-access-fc72m\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.928058 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546e3884-d904-4d23-853e-6855aee00e02-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.933287 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.953999 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 11:38:10 crc kubenswrapper[4658]: I1002 11:38:10.995425 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data" (OuterVolumeSpecName: "config-data") pod "546e3884-d904-4d23-853e-6855aee00e02" (UID: "546e3884-d904-4d23-853e-6855aee00e02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.032040 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.032068 4658 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.032078 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546e3884-d904-4d23-853e-6855aee00e02-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.247179 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6306f11-af13-4078-ad43-b00e333855b1","Type":"ContainerStarted","Data":"29386e356b9e11db098aaae8c6ba539b8b7c44c1aa7456cf8b78838702ae0dc8"} Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.255489 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerStarted","Data":"880c5335760d4fc3c1520a44eaf711c59df4c066e41f295070f130daddcfb202"} Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.271895 4658 generic.go:334] "Generic (PLEG): container finished" podID="546e3884-d904-4d23-853e-6855aee00e02" containerID="9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0" exitCode=0 Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.272262 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.273538 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerDied","Data":"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0"} Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.273577 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"546e3884-d904-4d23-853e-6855aee00e02","Type":"ContainerDied","Data":"98de9a6095fd432b0fd27feac437a484bc2a1a1da6f0f448799a3236211c0082"} Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.273597 4658 scope.go:117] "RemoveContainer" containerID="9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.341079 4658 scope.go:117] "RemoveContainer" containerID="502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.342970 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.365355 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.380369 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:11 crc kubenswrapper[4658]: E1002 11:38:11.380885 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-log" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.380907 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-log" Oct 02 11:38:11 crc kubenswrapper[4658]: E1002 11:38:11.380926 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-httpd" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.380934 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-httpd" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.381196 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-log" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.381226 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="546e3884-d904-4d23-853e-6855aee00e02" containerName="glance-httpd" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.382537 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.391683 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.393409 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.408024 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444269 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444318 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444400 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444443 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444486 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-logs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444504 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444538 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qhdx\" (UniqueName: \"kubernetes.io/projected/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-kube-api-access-8qhdx\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444554 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.444719 4658 scope.go:117] "RemoveContainer" containerID="9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0" Oct 02 11:38:11 crc kubenswrapper[4658]: E1002 11:38:11.449555 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0\": container with ID starting with 9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0 not found: ID does not exist" containerID="9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.449592 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0"} err="failed to get container status \"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0\": rpc error: code = NotFound desc = could not find container \"9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0\": container with ID starting with 9e4e3365e2b34b95d598b4e94b2caf92f4cd470b52ae762a6ada89d48e70cbc0 not found: ID does not exist" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.449621 4658 scope.go:117] "RemoveContainer" containerID="502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2" Oct 02 11:38:11 crc kubenswrapper[4658]: E1002 11:38:11.451491 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2\": container with ID starting with 502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2 not found: ID does not exist" containerID="502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.451521 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2"} err="failed to get container status \"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2\": rpc error: code = NotFound desc = could not find container \"502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2\": container with ID starting with 502c1ee4fee37c5e0949c521ac9d6001694e15a0bd677350facde7626ec307a2 not found: ID does not exist" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546716 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-logs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546755 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 
11:38:11.546785 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qhdx\" (UniqueName: \"kubernetes.io/projected/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-kube-api-access-8qhdx\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546804 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546859 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546877 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.546953 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.547000 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.547389 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.547509 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.549019 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-logs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.552862 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.553047 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.554052 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.554818 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.562146 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qhdx\" (UniqueName: \"kubernetes.io/projected/67f8b15f-e190-40d6-8b7b-e8ba932f00f9-kube-api-access-8qhdx\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.588970 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"67f8b15f-e190-40d6-8b7b-e8ba932f00f9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.726827 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.806164 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.852664 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llhcl\" (UniqueName: \"kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl\") pod \"6473d21f-8a15-443f-b5ac-2211e1cf0e55\" (UID: \"6473d21f-8a15-443f-b5ac-2211e1cf0e55\") " Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.858000 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl" (OuterVolumeSpecName: "kube-api-access-llhcl") pod "6473d21f-8a15-443f-b5ac-2211e1cf0e55" (UID: "6473d21f-8a15-443f-b5ac-2211e1cf0e55"). InnerVolumeSpecName "kube-api-access-llhcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.956388 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llhcl\" (UniqueName: \"kubernetes.io/projected/6473d21f-8a15-443f-b5ac-2211e1cf0e55-kube-api-access-llhcl\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:11 crc kubenswrapper[4658]: I1002 11:38:11.998576 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546e3884-d904-4d23-853e-6855aee00e02" path="/var/lib/kubelet/pods/546e3884-d904-4d23-853e-6855aee00e02/volumes" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.124364 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.150960 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.174507 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.262943 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpfxg\" (UniqueName: \"kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg\") pod \"08cd2749-20f1-4836-ac61-62b7d555a3b3\" (UID: \"08cd2749-20f1-4836-ac61-62b7d555a3b3\") " Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.263041 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkhjv\" (UniqueName: \"kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv\") pod \"5806c84d-2c8f-402d-9487-656bd2936933\" (UID: \"5806c84d-2c8f-402d-9487-656bd2936933\") " Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.268924 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg" (OuterVolumeSpecName: "kube-api-access-kpfxg") pod "08cd2749-20f1-4836-ac61-62b7d555a3b3" (UID: "08cd2749-20f1-4836-ac61-62b7d555a3b3"). InnerVolumeSpecName "kube-api-access-kpfxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.270217 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv" (OuterVolumeSpecName: "kube-api-access-wkhjv") pod "5806c84d-2c8f-402d-9487-656bd2936933" (UID: "5806c84d-2c8f-402d-9487-656bd2936933"). InnerVolumeSpecName "kube-api-access-wkhjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.287840 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6306f11-af13-4078-ad43-b00e333855b1","Type":"ContainerStarted","Data":"a367d81a67ec7dd2e518f12c456316933cf87952e1536d3d65b9d924c5973a9c"} Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.292987 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mrrgm" event={"ID":"08cd2749-20f1-4836-ac61-62b7d555a3b3","Type":"ContainerDied","Data":"a64ce5bbddf0a0c017592b26cd642a4b7265b3c0eff0941755fac2c12ba195b3"} Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.293033 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a64ce5bbddf0a0c017592b26cd642a4b7265b3c0eff0941755fac2c12ba195b3" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.293116 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mrrgm" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.296633 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerStarted","Data":"59c53874adab16a378cbc23dc91a94d10d8e3918020ccff221e50334401825fe"} Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.298142 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85tpz" event={"ID":"6473d21f-8a15-443f-b5ac-2211e1cf0e55","Type":"ContainerDied","Data":"f094dac3c8a93e46d4bc268246c0aac1d540a2a702b826b05047aa65728ee118"} Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.298435 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f094dac3c8a93e46d4bc268246c0aac1d540a2a702b826b05047aa65728ee118" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.298510 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-85tpz" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.309763 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gt5m2" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.309813 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gt5m2" event={"ID":"5806c84d-2c8f-402d-9487-656bd2936933","Type":"ContainerDied","Data":"fb90397f9ca26addbd657051ece598417ce1d615e4557392e3305a1970222cfd"} Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.309839 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb90397f9ca26addbd657051ece598417ce1d615e4557392e3305a1970222cfd" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.365343 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpfxg\" (UniqueName: \"kubernetes.io/projected/08cd2749-20f1-4836-ac61-62b7d555a3b3-kube-api-access-kpfxg\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.365373 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkhjv\" (UniqueName: \"kubernetes.io/projected/5806c84d-2c8f-402d-9487-656bd2936933-kube-api-access-wkhjv\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:12 crc kubenswrapper[4658]: W1002 11:38:12.378547 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f8b15f_e190_40d6_8b7b_e8ba932f00f9.slice/crio-592296ed031f1fb84f32d15e82d9972cfb24a732aa37697f75c15c9b6ee28823 WatchSource:0}: Error finding container 592296ed031f1fb84f32d15e82d9972cfb24a732aa37697f75c15c9b6ee28823: Status 404 returned error can't find the container with id 592296ed031f1fb84f32d15e82d9972cfb24a732aa37697f75c15c9b6ee28823 Oct 02 11:38:12 crc kubenswrapper[4658]: I1002 11:38:12.380670 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:38:13 crc kubenswrapper[4658]: I1002 11:38:13.333035 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67f8b15f-e190-40d6-8b7b-e8ba932f00f9","Type":"ContainerStarted","Data":"6bf20f4786d3b4e47a82c3476d25f42ba11bd51733c0df611d6b539a45a74837"} Oct 02 11:38:13 crc kubenswrapper[4658]: I1002 11:38:13.333532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67f8b15f-e190-40d6-8b7b-e8ba932f00f9","Type":"ContainerStarted","Data":"592296ed031f1fb84f32d15e82d9972cfb24a732aa37697f75c15c9b6ee28823"} Oct 02 11:38:13 crc kubenswrapper[4658]: I1002 11:38:13.338889 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6306f11-af13-4078-ad43-b00e333855b1","Type":"ContainerStarted","Data":"5108a452c3a5d3f800b7d5c5e9e32d5ae77b234b09a6fbc7b0018e28a17c10b5"} Oct 02 11:38:13 crc kubenswrapper[4658]: I1002 11:38:13.342604 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerStarted","Data":"723b2150cd37675a1145521aa4277ed2194baeb48e7a4c342daeb2f59b4f0280"} Oct 02 11:38:13 crc kubenswrapper[4658]: I1002 11:38:13.342653 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerStarted","Data":"5d26275c8b678613e651d4085bbd3ae796d348b44d9b73c6b8bbb4b883939958"} Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.355250 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"67f8b15f-e190-40d6-8b7b-e8ba932f00f9","Type":"ContainerStarted","Data":"f516923525400947ec3be511f5622870788314efd5edc7ba395c120b61fc791b"} Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.388395 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.388354596 podStartE2EDuration="5.388354596s" podCreationTimestamp="2025-10-02 11:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:38:13.366770671 +0000 UTC m=+1174.257924238" watchObservedRunningTime="2025-10-02 11:38:14.388354596 +0000 UTC m=+1175.279508163" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.389287 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.389278896 podStartE2EDuration="3.389278896s" podCreationTimestamp="2025-10-02 11:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:38:14.37708802 +0000 UTC m=+1175.268241587" watchObservedRunningTime="2025-10-02 11:38:14.389278896 +0000 UTC m=+1175.280432473" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.556234 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-20ab-account-create-swrjh"] Oct 02 11:38:14 crc kubenswrapper[4658]: E1002 11:38:14.556780 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cd2749-20f1-4836-ac61-62b7d555a3b3" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.556803 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cd2749-20f1-4836-ac61-62b7d555a3b3" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: E1002 11:38:14.556842 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5806c84d-2c8f-402d-9487-656bd2936933" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.556852 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5806c84d-2c8f-402d-9487-656bd2936933" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: E1002 11:38:14.556865 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6473d21f-8a15-443f-b5ac-2211e1cf0e55" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.556873 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6473d21f-8a15-443f-b5ac-2211e1cf0e55" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.558167 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cd2749-20f1-4836-ac61-62b7d555a3b3" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.558210 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="5806c84d-2c8f-402d-9487-656bd2936933" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.558224 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6473d21f-8a15-443f-b5ac-2211e1cf0e55" containerName="mariadb-database-create" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.559154 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.562159 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.568784 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-20ab-account-create-swrjh"] Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.651617 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1e6d-account-create-mg5kn"] Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.653328 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.656020 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.669503 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1e6d-account-create-mg5kn"] Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.717044 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffg2\" (UniqueName: \"kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2\") pod \"nova-api-20ab-account-create-swrjh\" (UID: \"9389f8bc-6161-444c-8d9c-712f0e494c99\") " pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.818696 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptdvm\" (UniqueName: \"kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm\") pod \"nova-cell0-1e6d-account-create-mg5kn\" (UID: \"42121d24-1598-4e99-890c-9e74b7576895\") " pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.818855 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffg2\" (UniqueName: \"kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2\") pod \"nova-api-20ab-account-create-swrjh\" (UID: \"9389f8bc-6161-444c-8d9c-712f0e494c99\") " pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.850550 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffg2\" (UniqueName: \"kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2\") pod \"nova-api-20ab-account-create-swrjh\" (UID: \"9389f8bc-6161-444c-8d9c-712f0e494c99\") " pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.852816 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e5d5-account-create-pwtn4"] Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.854247 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.857184 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.877373 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e5d5-account-create-pwtn4"] Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.880466 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.920762 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptdvm\" (UniqueName: \"kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm\") pod \"nova-cell0-1e6d-account-create-mg5kn\" (UID: \"42121d24-1598-4e99-890c-9e74b7576895\") " pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.941129 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptdvm\" (UniqueName: \"kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm\") pod \"nova-cell0-1e6d-account-create-mg5kn\" (UID: \"42121d24-1598-4e99-890c-9e74b7576895\") " pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:14 crc kubenswrapper[4658]: I1002 11:38:14.977653 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.022432 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrtq\" (UniqueName: \"kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq\") pod \"nova-cell1-e5d5-account-create-pwtn4\" (UID: \"990db662-1228-4b71-92fb-af4f1aad1d79\") " pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.123839 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrtq\" (UniqueName: \"kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq\") pod \"nova-cell1-e5d5-account-create-pwtn4\" (UID: \"990db662-1228-4b71-92fb-af4f1aad1d79\") " pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.161861 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrtq\" (UniqueName: \"kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq\") pod \"nova-cell1-e5d5-account-create-pwtn4\" (UID: \"990db662-1228-4b71-92fb-af4f1aad1d79\") " pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.207245 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.589586 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1e6d-account-create-mg5kn"] Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.688723 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-20ab-account-create-swrjh"] Oct 02 11:38:15 crc kubenswrapper[4658]: W1002 11:38:15.702841 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9389f8bc_6161_444c_8d9c_712f0e494c99.slice/crio-8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc WatchSource:0}: Error finding container 8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc: Status 404 returned error can't find the container with id 8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc Oct 02 11:38:15 crc kubenswrapper[4658]: W1002 11:38:15.865136 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod990db662_1228_4b71_92fb_af4f1aad1d79.slice/crio-adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd WatchSource:0}: Error finding container adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd: Status 404 returned error can't find the container with id adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd Oct 02 11:38:15 crc kubenswrapper[4658]: I1002 11:38:15.867149 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e5d5-account-create-pwtn4"] Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.376939 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" event={"ID":"42121d24-1598-4e99-890c-9e74b7576895","Type":"ContainerStarted","Data":"4d5bbf975658780b7139160eef6863f2cfe845587de6d69e84a669ad6324320c"} Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.378477 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" event={"ID":"990db662-1228-4b71-92fb-af4f1aad1d79","Type":"ContainerStarted","Data":"adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd"} Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.380004 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-20ab-account-create-swrjh" event={"ID":"9389f8bc-6161-444c-8d9c-712f0e494c99","Type":"ContainerStarted","Data":"8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc"} Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.382761 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerStarted","Data":"8b1b34a600c51b86c374ec0b97c7476c19a8dfcd337854087996a0693c7b42eb"} Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.382882 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-central-agent" containerID="cri-o://59c53874adab16a378cbc23dc91a94d10d8e3918020ccff221e50334401825fe" gracePeriod=30 Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.382923 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.382947 4658 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="sg-core" containerID="cri-o://5d26275c8b678613e651d4085bbd3ae796d348b44d9b73c6b8bbb4b883939958" gracePeriod=30 Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.382962 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-notification-agent" containerID="cri-o://723b2150cd37675a1145521aa4277ed2194baeb48e7a4c342daeb2f59b4f0280" gracePeriod=30 Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.383110 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="proxy-httpd" containerID="cri-o://8b1b34a600c51b86c374ec0b97c7476c19a8dfcd337854087996a0693c7b42eb" gracePeriod=30 Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.413008 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.693034781 podStartE2EDuration="7.412993002s" podCreationTimestamp="2025-10-02 11:38:09 +0000 UTC" firstStartedPulling="2025-10-02 11:38:10.315655752 +0000 UTC m=+1171.206809319" lastFinishedPulling="2025-10-02 11:38:15.035613963 +0000 UTC m=+1175.926767540" observedRunningTime="2025-10-02 11:38:16.409288242 +0000 UTC m=+1177.300441819" watchObservedRunningTime="2025-10-02 11:38:16.412993002 +0000 UTC m=+1177.304146569" Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.683611 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:16 crc kubenswrapper[4658]: I1002 11:38:16.684403 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5566488b4c-k88mg" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398524 4658 generic.go:334] "Generic (PLEG): container finished" podID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerID="8b1b34a600c51b86c374ec0b97c7476c19a8dfcd337854087996a0693c7b42eb" exitCode=0 Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398857 4658 generic.go:334] "Generic (PLEG): container finished" podID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerID="5d26275c8b678613e651d4085bbd3ae796d348b44d9b73c6b8bbb4b883939958" exitCode=2 Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398867 4658 generic.go:334] "Generic (PLEG): container finished" podID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerID="723b2150cd37675a1145521aa4277ed2194baeb48e7a4c342daeb2f59b4f0280" exitCode=0 Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398874 4658 generic.go:334] "Generic (PLEG): container finished" podID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerID="59c53874adab16a378cbc23dc91a94d10d8e3918020ccff221e50334401825fe" exitCode=0 Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398843 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerDied","Data":"8b1b34a600c51b86c374ec0b97c7476c19a8dfcd337854087996a0693c7b42eb"} Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398953 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerDied","Data":"5d26275c8b678613e651d4085bbd3ae796d348b44d9b73c6b8bbb4b883939958"} Oct 02 11:38:17 crc 
kubenswrapper[4658]: I1002 11:38:17.398971 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerDied","Data":"723b2150cd37675a1145521aa4277ed2194baeb48e7a4c342daeb2f59b4f0280"} Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.398988 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerDied","Data":"59c53874adab16a378cbc23dc91a94d10d8e3918020ccff221e50334401825fe"} Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.763908 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898594 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898651 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898693 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898833 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ff8g\" (UniqueName: \"kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898880 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898935 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.898967 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd\") pod \"3f015afb-5dc8-4a46-908a-2df29e61c05f\" (UID: \"3f015afb-5dc8-4a46-908a-2df29e61c05f\") " Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.900549 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.900676 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.908438 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts" (OuterVolumeSpecName: "scripts") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.908875 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g" (OuterVolumeSpecName: "kube-api-access-4ff8g") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "kube-api-access-4ff8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:17 crc kubenswrapper[4658]: I1002 11:38:17.944359 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.001517 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ff8g\" (UniqueName: \"kubernetes.io/projected/3f015afb-5dc8-4a46-908a-2df29e61c05f-kube-api-access-4ff8g\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.001549 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.001557 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.001568 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f015afb-5dc8-4a46-908a-2df29e61c05f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.001581 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.031584 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data" (OuterVolumeSpecName: "config-data") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.045247 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f015afb-5dc8-4a46-908a-2df29e61c05f" (UID: "3f015afb-5dc8-4a46-908a-2df29e61c05f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.103771 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.103802 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f015afb-5dc8-4a46-908a-2df29e61c05f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.412495 4658 generic.go:334] "Generic (PLEG): container finished" podID="9389f8bc-6161-444c-8d9c-712f0e494c99" containerID="8c9d3ad474bcf1ab2f6607b84cffeea01b7f397c61d8c0df65a7a8b23a17aaa5" exitCode=0 Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.412570 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-20ab-account-create-swrjh" event={"ID":"9389f8bc-6161-444c-8d9c-712f0e494c99","Type":"ContainerDied","Data":"8c9d3ad474bcf1ab2f6607b84cffeea01b7f397c61d8c0df65a7a8b23a17aaa5"} Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.422900 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f015afb-5dc8-4a46-908a-2df29e61c05f","Type":"ContainerDied","Data":"880c5335760d4fc3c1520a44eaf711c59df4c066e41f295070f130daddcfb202"} Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.422950 4658 scope.go:117] "RemoveContainer" containerID="8b1b34a600c51b86c374ec0b97c7476c19a8dfcd337854087996a0693c7b42eb" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.422913 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.431865 4658 generic.go:334] "Generic (PLEG): container finished" podID="42121d24-1598-4e99-890c-9e74b7576895" containerID="c0db916e67cae1a0a9338a4d1bc79c90fb4ad51c67a1febc4d5e41742cbd2836" exitCode=0 Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.432062 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" event={"ID":"42121d24-1598-4e99-890c-9e74b7576895","Type":"ContainerDied","Data":"c0db916e67cae1a0a9338a4d1bc79c90fb4ad51c67a1febc4d5e41742cbd2836"} Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.443053 4658 generic.go:334] "Generic (PLEG): container finished" podID="990db662-1228-4b71-92fb-af4f1aad1d79" containerID="460e2a82155c4f23a1957b27c27d2355d7052d7b2c2469a2a205ffc0b734ee04" exitCode=0 Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.443094 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" event={"ID":"990db662-1228-4b71-92fb-af4f1aad1d79","Type":"ContainerDied","Data":"460e2a82155c4f23a1957b27c27d2355d7052d7b2c2469a2a205ffc0b734ee04"} Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.465156 4658 scope.go:117] "RemoveContainer" containerID="5d26275c8b678613e651d4085bbd3ae796d348b44d9b73c6b8bbb4b883939958" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.498737 4658 scope.go:117] "RemoveContainer" containerID="723b2150cd37675a1145521aa4277ed2194baeb48e7a4c342daeb2f59b4f0280" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.531106 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.539132 4658 scope.go:117] "RemoveContainer" containerID="59c53874adab16a378cbc23dc91a94d10d8e3918020ccff221e50334401825fe" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.539652 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.551198 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:18 crc kubenswrapper[4658]: E1002 11:38:18.551712 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-notification-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.551734 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-notification-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: E1002 11:38:18.551757 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-central-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.551764 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-central-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: E1002 11:38:18.551791 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="proxy-httpd" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.551800 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="proxy-httpd" Oct 02 11:38:18 crc kubenswrapper[4658]: E1002 11:38:18.551821 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="sg-core" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.551829 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="sg-core" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.552446 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-central-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.552493 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="sg-core" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.552509 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="ceilometer-notification-agent" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.552529 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" containerName="proxy-httpd" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.554795 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.558208 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.558885 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.559238 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725590 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725710 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725736 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725830 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725871 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " 
pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725889 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.725906 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.827693 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.828895 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.829222 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.829449 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.829607 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.829479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.830034 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.830188 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb\") pod \"ceilometer-0\" (UID: 
\"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.830255 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.834743 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.834910 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.835548 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.837734 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.856508 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb\") pod \"ceilometer-0\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " pod="openstack/ceilometer-0" Oct 02 11:38:18 crc kubenswrapper[4658]: I1002 11:38:18.880656 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:19 crc kubenswrapper[4658]: W1002 11:38:19.367996 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4af3e0cc_30fb_40d9_b258_7221528249cb.slice/crio-9f83add7f8d29d91248a8d2bdbc504bf6f7d19dd540ebd735fca1332ded955fe WatchSource:0}: Error finding container 9f83add7f8d29d91248a8d2bdbc504bf6f7d19dd540ebd735fca1332ded955fe: Status 404 returned error can't find the container with id 9f83add7f8d29d91248a8d2bdbc504bf6f7d19dd540ebd735fca1332ded955fe Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.369771 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.457423 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerStarted","Data":"9f83add7f8d29d91248a8d2bdbc504bf6f7d19dd540ebd735fca1332ded955fe"} Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.543207 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.543260 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.545238 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.597436 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.597612 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.705317 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.707722 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.750862 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:19 crc kubenswrapper[4658]: I1002 11:38:19.762392 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.019544 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f015afb-5dc8-4a46-908a-2df29e61c05f" path="/var/lib/kubelet/pods/3f015afb-5dc8-4a46-908a-2df29e61c05f/volumes" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.212388 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.228547 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.255626 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.361926 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptdvm\" (UniqueName: \"kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm\") pod \"42121d24-1598-4e99-890c-9e74b7576895\" (UID: \"42121d24-1598-4e99-890c-9e74b7576895\") " Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.362289 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tffg2\" (UniqueName: \"kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2\") pod \"9389f8bc-6161-444c-8d9c-712f0e494c99\" (UID: \"9389f8bc-6161-444c-8d9c-712f0e494c99\") " Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.362568 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsrtq\" (UniqueName: \"kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq\") pod \"990db662-1228-4b71-92fb-af4f1aad1d79\" (UID: \"990db662-1228-4b71-92fb-af4f1aad1d79\") " Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.368853 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq" (OuterVolumeSpecName: "kube-api-access-lsrtq") pod "990db662-1228-4b71-92fb-af4f1aad1d79" (UID: "990db662-1228-4b71-92fb-af4f1aad1d79"). InnerVolumeSpecName "kube-api-access-lsrtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.370859 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm" (OuterVolumeSpecName: "kube-api-access-ptdvm") pod "42121d24-1598-4e99-890c-9e74b7576895" (UID: "42121d24-1598-4e99-890c-9e74b7576895"). InnerVolumeSpecName "kube-api-access-ptdvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.371033 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2" (OuterVolumeSpecName: "kube-api-access-tffg2") pod "9389f8bc-6161-444c-8d9c-712f0e494c99" (UID: "9389f8bc-6161-444c-8d9c-712f0e494c99"). InnerVolumeSpecName "kube-api-access-tffg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.466150 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsrtq\" (UniqueName: \"kubernetes.io/projected/990db662-1228-4b71-92fb-af4f1aad1d79-kube-api-access-lsrtq\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.466179 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptdvm\" (UniqueName: \"kubernetes.io/projected/42121d24-1598-4e99-890c-9e74b7576895-kube-api-access-ptdvm\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.466209 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tffg2\" (UniqueName: \"kubernetes.io/projected/9389f8bc-6161-444c-8d9c-712f0e494c99-kube-api-access-tffg2\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.468510 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-20ab-account-create-swrjh" event={"ID":"9389f8bc-6161-444c-8d9c-712f0e494c99","Type":"ContainerDied","Data":"8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc"} Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.468623 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8645575596ef55a851bfaa5f164ff66edcd99acfe3c73287a921573b3693eefc" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.468747 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-20ab-account-create-swrjh" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.481215 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerStarted","Data":"2c46eee6a32bc462cf7745e9aa191f4f6d7330b5409e4205d596256ebd36bbf6"} Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.487882 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" event={"ID":"990db662-1228-4b71-92fb-af4f1aad1d79","Type":"ContainerDied","Data":"adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd"} Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.487921 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf1c292ead44c7dc773f1db76839d4e905fc14e05b7f5415bb5aa06d1692bdd" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.487994 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e5d5-account-create-pwtn4" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.502727 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" event={"ID":"42121d24-1598-4e99-890c-9e74b7576895","Type":"ContainerDied","Data":"4d5bbf975658780b7139160eef6863f2cfe845587de6d69e84a669ad6324320c"} Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.502836 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5bbf975658780b7139160eef6863f2cfe845587de6d69e84a669ad6324320c" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.502956 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1e6d-account-create-mg5kn" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.503891 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:20 crc kubenswrapper[4658]: I1002 11:38:20.503926 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:21 crc kubenswrapper[4658]: I1002 11:38:21.515044 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerStarted","Data":"e04e2c9d793ea0c3352448a559d54c1c2756e18120441090020267401b61ddf9"} Oct 02 11:38:21 crc kubenswrapper[4658]: I1002 11:38:21.727459 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:38:21 crc kubenswrapper[4658]: I1002 11:38:21.727520 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:38:21 crc kubenswrapper[4658]: I1002 11:38:21.768789 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:38:21 crc kubenswrapper[4658]: I1002 11:38:21.778612 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.410536 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.536235 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.536263 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.537322 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerStarted","Data":"40405e91464c3ac9e79d6978a3bc0dbba8682ee5395abb75cbb3d42fe7757848"} Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.538423 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:38:22 crc kubenswrapper[4658]: I1002 11:38:22.538450 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:38:23 crc kubenswrapper[4658]: I1002 11:38:23.432633 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:23 crc kubenswrapper[4658]: I1002 11:38:23.440667 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559162 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerStarted","Data":"bbb5e8b814ea804b3841de3f2b11ef4e076af6428f21ccd446c18bd3d552b80d"} Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559201 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559214 4658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 
11:38:24.559326 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="sg-core" containerID="cri-o://40405e91464c3ac9e79d6978a3bc0dbba8682ee5395abb75cbb3d42fe7757848" gracePeriod=30 Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559358 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559355 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-central-agent" containerID="cri-o://2c46eee6a32bc462cf7745e9aa191f4f6d7330b5409e4205d596256ebd36bbf6" gracePeriod=30 Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559401 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-notification-agent" containerID="cri-o://e04e2c9d793ea0c3352448a559d54c1c2756e18120441090020267401b61ddf9" gracePeriod=30 Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.559326 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="proxy-httpd" containerID="cri-o://bbb5e8b814ea804b3841de3f2b11ef4e076af6428f21ccd446c18bd3d552b80d" gracePeriod=30 Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.593006 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.189053694 podStartE2EDuration="6.592989044s" podCreationTimestamp="2025-10-02 11:38:18 +0000 UTC" firstStartedPulling="2025-10-02 11:38:19.37082078 +0000 UTC m=+1180.261974347" lastFinishedPulling="2025-10-02 11:38:23.77475613 +0000 UTC m=+1184.665909697" observedRunningTime="2025-10-02 11:38:24.587802826 +0000 UTC m=+1185.478956423" watchObservedRunningTime="2025-10-02 11:38:24.592989044 +0000 UTC m=+1185.484142611" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.974638 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-55974"] Oct 02 11:38:24 crc kubenswrapper[4658]: E1002 11:38:24.975034 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990db662-1228-4b71-92fb-af4f1aad1d79" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975050 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="990db662-1228-4b71-92fb-af4f1aad1d79" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: E1002 11:38:24.975070 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42121d24-1598-4e99-890c-9e74b7576895" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975079 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="42121d24-1598-4e99-890c-9e74b7576895" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: E1002 11:38:24.975093 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9389f8bc-6161-444c-8d9c-712f0e494c99" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975099 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9389f8bc-6161-444c-8d9c-712f0e494c99" containerName="mariadb-account-create" Oct 02 11:38:24 crc 
kubenswrapper[4658]: I1002 11:38:24.975282 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="990db662-1228-4b71-92fb-af4f1aad1d79" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975322 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="42121d24-1598-4e99-890c-9e74b7576895" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975337 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="9389f8bc-6161-444c-8d9c-712f0e494c99" containerName="mariadb-account-create" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.975935 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.978862 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.980573 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k4sqh" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.989858 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 11:38:24 crc kubenswrapper[4658]: I1002 11:38:24.992884 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-55974"] Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.072122 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.072163 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.072186 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.072220 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxczs\" (UniqueName: \"kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.173829 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxczs\" (UniqueName: \"kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " 
pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.174064 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.174093 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.174109 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.179872 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.181038 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.194982 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxczs\" (UniqueName: \"kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.198061 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-55974\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.293871 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.336779 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.341117 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.594322 4658 generic.go:334] "Generic (PLEG): container finished" podID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerID="bbb5e8b814ea804b3841de3f2b11ef4e076af6428f21ccd446c18bd3d552b80d" exitCode=0 Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.594584 4658 generic.go:334] "Generic (PLEG): container finished" podID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerID="40405e91464c3ac9e79d6978a3bc0dbba8682ee5395abb75cbb3d42fe7757848" exitCode=2 Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.594593 4658 generic.go:334] "Generic (PLEG): container finished" podID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerID="e04e2c9d793ea0c3352448a559d54c1c2756e18120441090020267401b61ddf9" exitCode=0 Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.597393 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerDied","Data":"bbb5e8b814ea804b3841de3f2b11ef4e076af6428f21ccd446c18bd3d552b80d"} Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.597447 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerDied","Data":"40405e91464c3ac9e79d6978a3bc0dbba8682ee5395abb75cbb3d42fe7757848"} Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.597459 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerDied","Data":"e04e2c9d793ea0c3352448a559d54c1c2756e18120441090020267401b61ddf9"} Oct 02 11:38:25 crc kubenswrapper[4658]: I1002 11:38:25.876675 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-55974"] Oct 02 11:38:26 crc kubenswrapper[4658]: I1002 11:38:26.616839 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-55974" event={"ID":"f4e2ba1e-bb1f-4770-a261-979b3f467bce","Type":"ContainerStarted","Data":"ea691c7ed8e6a9193abbd12ff0c0bd4fa21833deca6be248255600315c7f9d24"} Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.429639 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.429711 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.429766 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 
Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.430384 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.430458 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952" gracePeriod=600
Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.631494 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952" exitCode=0
Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.631779 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952"}
Oct 02 11:38:27 crc kubenswrapper[4658]: I1002 11:38:27.631812 4658 scope.go:117] "RemoveContainer" containerID="d11d8049b244ab8835831d1427eb5be75c611efce4e7cb5b809ccc2a5ccfd02a"
Oct 02 11:38:28 crc kubenswrapper[4658]: I1002 11:38:28.650710 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11"}
Oct 02 11:38:29 crc kubenswrapper[4658]: I1002 11:38:29.544771 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Oct 02 11:38:29 crc kubenswrapper[4658]: I1002 11:38:29.597651 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-776f4bfd7b-cm7vj" podUID="02408c48-14d8-4a7b-8ebf-79fd2fa1b924" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused"
Oct 02 11:38:30 crc kubenswrapper[4658]: I1002 11:38:30.672710 4658 generic.go:334] "Generic (PLEG): container finished" podID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerID="2c46eee6a32bc462cf7745e9aa191f4f6d7330b5409e4205d596256ebd36bbf6" exitCode=0
Oct 02 11:38:30 crc kubenswrapper[4658]: I1002 11:38:30.672787 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerDied","Data":"2c46eee6a32bc462cf7745e9aa191f4f6d7330b5409e4205d596256ebd36bbf6"}
Oct 02 11:38:33 crc kubenswrapper[4658]: I1002 11:38:33.941361 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062379 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062629 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062768 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062907 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062982 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.063053 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.062903 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.063119 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.063245 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data\") pod \"4af3e0cc-30fb-40d9-b258-7221528249cb\" (UID: \"4af3e0cc-30fb-40d9-b258-7221528249cb\") " Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.063790 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.063911 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3e0cc-30fb-40d9-b258-7221528249cb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.067033 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb" (OuterVolumeSpecName: "kube-api-access-8mwwb") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "kube-api-access-8mwwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.067433 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts" (OuterVolumeSpecName: "scripts") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.092002 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.140056 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.157955 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data" (OuterVolumeSpecName: "config-data") pod "4af3e0cc-30fb-40d9-b258-7221528249cb" (UID: "4af3e0cc-30fb-40d9-b258-7221528249cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.166342 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.166382 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.166391 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.166402 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3e0cc-30fb-40d9-b258-7221528249cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.166411 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/4af3e0cc-30fb-40d9-b258-7221528249cb-kube-api-access-8mwwb\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.720365 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3e0cc-30fb-40d9-b258-7221528249cb","Type":"ContainerDied","Data":"9f83add7f8d29d91248a8d2bdbc504bf6f7d19dd540ebd735fca1332ded955fe"} Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.720434 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.720470 4658 scope.go:117] "RemoveContainer" containerID="bbb5e8b814ea804b3841de3f2b11ef4e076af6428f21ccd446c18bd3d552b80d" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.722995 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-55974" event={"ID":"f4e2ba1e-bb1f-4770-a261-979b3f467bce","Type":"ContainerStarted","Data":"68e4a88a6165fdf99aca95846caaccdd2d173ccb2aaa09e4dc4623a9c1a01c17"} Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.749963 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-55974" podStartSLOduration=2.986706716 podStartE2EDuration="10.749944962s" podCreationTimestamp="2025-10-02 11:38:24 +0000 UTC" firstStartedPulling="2025-10-02 11:38:25.882434755 +0000 UTC m=+1186.773588322" lastFinishedPulling="2025-10-02 11:38:33.645673001 +0000 UTC m=+1194.536826568" observedRunningTime="2025-10-02 11:38:34.741215789 +0000 UTC m=+1195.632369356" watchObservedRunningTime="2025-10-02 11:38:34.749944962 +0000 UTC m=+1195.641098529" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.770716 4658 scope.go:117] "RemoveContainer" containerID="40405e91464c3ac9e79d6978a3bc0dbba8682ee5395abb75cbb3d42fe7757848" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.784928 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.803866 4658 scope.go:117] "RemoveContainer" containerID="e04e2c9d793ea0c3352448a559d54c1c2756e18120441090020267401b61ddf9" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.817444 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.831849 4658 scope.go:117] "RemoveContainer" containerID="2c46eee6a32bc462cf7745e9aa191f4f6d7330b5409e4205d596256ebd36bbf6" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.837494 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:34 crc kubenswrapper[4658]: E1002 11:38:34.838468 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-central-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.838500 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-central-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: E1002 11:38:34.838521 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-notification-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.838531 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-notification-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: E1002 11:38:34.838559 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="sg-core" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.838568 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="sg-core" Oct 02 11:38:34 crc kubenswrapper[4658]: E1002 11:38:34.838603 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="proxy-httpd" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.838613 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="proxy-httpd" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.839001 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="sg-core" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.839026 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-central-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.839043 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="proxy-httpd" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.839060 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" containerName="ceilometer-notification-agent" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.843881 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.849029 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.849162 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.849242 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880063 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880267 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880481 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880524 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880642 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " 
pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880707 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vprp\" (UniqueName: \"kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.880744 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.981861 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.981922 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vprp\" (UniqueName: \"kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.981946 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.981984 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.982041 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.982115 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.982138 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.982785 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd\") pod \"ceilometer-0\" (UID: 
\"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.983054 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.986110 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:34 crc kubenswrapper[4658]: I1002 11:38:34.989701 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.001132 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.001382 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.005259 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vprp\" (UniqueName: \"kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp\") pod \"ceilometer-0\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " pod="openstack/ceilometer-0" Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.169655 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.640177 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.734096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerStarted","Data":"bc308b26bc676ebc2e117b0c62ede2880d1b25ab0e9e8353dbb05235fae9242d"} Oct 02 11:38:35 crc kubenswrapper[4658]: I1002 11:38:35.964369 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af3e0cc-30fb-40d9-b258-7221528249cb" path="/var/lib/kubelet/pods/4af3e0cc-30fb-40d9-b258-7221528249cb/volumes" Oct 02 11:38:36 crc kubenswrapper[4658]: I1002 11:38:36.747353 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerStarted","Data":"5b1de94224d29cbb163d69c906789e6814af80e09633086a3a831d2f6b308575"} Oct 02 11:38:38 crc kubenswrapper[4658]: I1002 11:38:38.774868 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerStarted","Data":"06c9eb5b806519c04e75ecc7272ff46bf132ae70b39e9d919937bab7947edf5b"} Oct 02 11:38:38 crc kubenswrapper[4658]: I1002 11:38:38.775430 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerStarted","Data":"a2c2377984ebc2cc89af9f7e7eb1cb1d72b7675a0006dbbe6169018ee2f6dd67"} Oct 02 11:38:40 crc kubenswrapper[4658]: I1002 11:38:40.800151 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerStarted","Data":"d812e7fd90a1fafd8db84214e9acb2c4441dfa1110bec0890d0f08cd357cb997"} Oct 02 11:38:40 crc kubenswrapper[4658]: I1002 11:38:40.800760 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:38:40 crc kubenswrapper[4658]: I1002 11:38:40.830162 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.573135882 podStartE2EDuration="6.830136486s" podCreationTimestamp="2025-10-02 11:38:34 +0000 UTC" firstStartedPulling="2025-10-02 11:38:35.641136758 +0000 UTC m=+1196.532290315" lastFinishedPulling="2025-10-02 11:38:39.898137352 +0000 UTC m=+1200.789290919" observedRunningTime="2025-10-02 11:38:40.821400371 +0000 UTC m=+1201.712553948" watchObservedRunningTime="2025-10-02 11:38:40.830136486 +0000 UTC m=+1201.721290053" Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.354153 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.369140 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.558095 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.819433 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="proxy-httpd" containerID="cri-o://d812e7fd90a1fafd8db84214e9acb2c4441dfa1110bec0890d0f08cd357cb997" 
gracePeriod=30 Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.819461 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="sg-core" containerID="cri-o://06c9eb5b806519c04e75ecc7272ff46bf132ae70b39e9d919937bab7947edf5b" gracePeriod=30 Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.819501 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-notification-agent" containerID="cri-o://a2c2377984ebc2cc89af9f7e7eb1cb1d72b7675a0006dbbe6169018ee2f6dd67" gracePeriod=30 Oct 02 11:38:42 crc kubenswrapper[4658]: I1002 11:38:42.819417 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-central-agent" containerID="cri-o://5b1de94224d29cbb163d69c906789e6814af80e09633086a3a831d2f6b308575" gracePeriod=30 Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.836898 4658 generic.go:334] "Generic (PLEG): container finished" podID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerID="d812e7fd90a1fafd8db84214e9acb2c4441dfa1110bec0890d0f08cd357cb997" exitCode=0 Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837219 4658 generic.go:334] "Generic (PLEG): container finished" podID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerID="06c9eb5b806519c04e75ecc7272ff46bf132ae70b39e9d919937bab7947edf5b" exitCode=2 Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837229 4658 generic.go:334] "Generic (PLEG): container finished" podID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerID="a2c2377984ebc2cc89af9f7e7eb1cb1d72b7675a0006dbbe6169018ee2f6dd67" exitCode=0 Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837241 4658 generic.go:334] "Generic (PLEG): container finished" podID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerID="5b1de94224d29cbb163d69c906789e6814af80e09633086a3a831d2f6b308575" exitCode=0 Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.836975 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerDied","Data":"d812e7fd90a1fafd8db84214e9acb2c4441dfa1110bec0890d0f08cd357cb997"} Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837288 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerDied","Data":"06c9eb5b806519c04e75ecc7272ff46bf132ae70b39e9d919937bab7947edf5b"} Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837349 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerDied","Data":"a2c2377984ebc2cc89af9f7e7eb1cb1d72b7675a0006dbbe6169018ee2f6dd67"} Oct 02 11:38:43 crc kubenswrapper[4658]: I1002 11:38:43.837360 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerDied","Data":"5b1de94224d29cbb163d69c906789e6814af80e09633086a3a831d2f6b308575"} Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.164973 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.297207 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.297245 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.297785 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298013 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298195 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298221 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298259 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vprp\" (UniqueName: \"kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298289 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml\") pod \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\" (UID: \"ae8abc1d-5289-4f1c-bd28-001f75d735c9\") " Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.298961 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.300384 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.303415 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts" (OuterVolumeSpecName: "scripts") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.317669 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp" (OuterVolumeSpecName: "kube-api-access-5vprp") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "kube-api-access-5vprp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.351500 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.401418 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.401448 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8abc1d-5289-4f1c-bd28-001f75d735c9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.401458 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vprp\" (UniqueName: \"kubernetes.io/projected/ae8abc1d-5289-4f1c-bd28-001f75d735c9-kube-api-access-5vprp\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.401468 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.412436 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.432444 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data" (OuterVolumeSpecName: "config-data") pod "ae8abc1d-5289-4f1c-bd28-001f75d735c9" (UID: "ae8abc1d-5289-4f1c-bd28-001f75d735c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.503606 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.503645 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8abc1d-5289-4f1c-bd28-001f75d735c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.636561 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-776f4bfd7b-cm7vj" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.718359 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.718559 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon-log" containerID="cri-o://aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd" gracePeriod=30 Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.718961 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" containerID="cri-o://19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2" gracePeriod=30 Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.731985 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.848863 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae8abc1d-5289-4f1c-bd28-001f75d735c9","Type":"ContainerDied","Data":"bc308b26bc676ebc2e117b0c62ede2880d1b25ab0e9e8353dbb05235fae9242d"} Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.848924 4658 scope.go:117] "RemoveContainer" containerID="d812e7fd90a1fafd8db84214e9acb2c4441dfa1110bec0890d0f08cd357cb997" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.849569 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.873035 4658 scope.go:117] "RemoveContainer" containerID="06c9eb5b806519c04e75ecc7272ff46bf132ae70b39e9d919937bab7947edf5b" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.891701 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.901133 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.909461 4658 scope.go:117] "RemoveContainer" containerID="a2c2377984ebc2cc89af9f7e7eb1cb1d72b7675a0006dbbe6169018ee2f6dd67" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.923153 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:44 crc kubenswrapper[4658]: E1002 11:38:44.924220 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-notification-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924239 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-notification-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: E1002 11:38:44.924269 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="proxy-httpd" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924276 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="proxy-httpd" Oct 02 11:38:44 crc kubenswrapper[4658]: E1002 11:38:44.924284 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-central-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924305 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-central-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: E1002 11:38:44.924328 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="sg-core" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924335 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="sg-core" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924714 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-central-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924760 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="proxy-httpd" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924790 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="sg-core" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.924813 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" containerName="ceilometer-notification-agent" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.948544 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.966654 4658 scope.go:117] "RemoveContainer" containerID="5b1de94224d29cbb163d69c906789e6814af80e09633086a3a831d2f6b308575" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.970061 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.972697 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:38:44 crc kubenswrapper[4658]: I1002 11:38:44.972974 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.015411 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.015923 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.015955 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.016010 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.016064 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.016151 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.016242 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnqc\" (UniqueName: \"kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118654 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118721 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118750 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118789 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118830 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118889 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.118948 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnqc\" (UniqueName: \"kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.119707 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.125125 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.125750 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.128411 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.128609 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.128920 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.136971 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnqc\" (UniqueName: \"kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc\") pod \"ceilometer-0\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.300356 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.602811 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:38:45 crc kubenswrapper[4658]: W1002 11:38:45.609435 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481baa7a_7d97_44dd_b038_47a2969e3124.slice/crio-a9dfb109de6163f92ba69046ccc22af7123ef0aaea67064b19b2d8de7e173ef9 WatchSource:0}: Error finding container a9dfb109de6163f92ba69046ccc22af7123ef0aaea67064b19b2d8de7e173ef9: Status 404 returned error can't find the container with id a9dfb109de6163f92ba69046ccc22af7123ef0aaea67064b19b2d8de7e173ef9 Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.861690 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerStarted","Data":"a9dfb109de6163f92ba69046ccc22af7123ef0aaea67064b19b2d8de7e173ef9"} Oct 02 11:38:45 crc kubenswrapper[4658]: I1002 11:38:45.961348 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8abc1d-5289-4f1c-bd28-001f75d735c9" path="/var/lib/kubelet/pods/ae8abc1d-5289-4f1c-bd28-001f75d735c9/volumes" Oct 02 11:38:46 crc kubenswrapper[4658]: I1002 11:38:46.872929 4658 generic.go:334] "Generic (PLEG): container finished" podID="f4e2ba1e-bb1f-4770-a261-979b3f467bce" containerID="68e4a88a6165fdf99aca95846caaccdd2d173ccb2aaa09e4dc4623a9c1a01c17" exitCode=0 Oct 02 11:38:46 crc kubenswrapper[4658]: I1002 11:38:46.873003 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-55974" event={"ID":"f4e2ba1e-bb1f-4770-a261-979b3f467bce","Type":"ContainerDied","Data":"68e4a88a6165fdf99aca95846caaccdd2d173ccb2aaa09e4dc4623a9c1a01c17"} Oct 02 11:38:46 crc kubenswrapper[4658]: I1002 11:38:46.875692 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerStarted","Data":"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af"} Oct 02 11:38:47 crc 
kubenswrapper[4658]: I1002 11:38:47.854172 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:60752->10.217.0.156:8443: read: connection reset by peer" Oct 02 11:38:47 crc kubenswrapper[4658]: I1002 11:38:47.887783 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerStarted","Data":"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9"} Oct 02 11:38:47 crc kubenswrapper[4658]: I1002 11:38:47.889907 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerStarted","Data":"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a"} Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.257413 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.372747 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle\") pod \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.372815 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data\") pod \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.372864 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts\") pod \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.373072 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxczs\" (UniqueName: \"kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs\") pod \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\" (UID: \"f4e2ba1e-bb1f-4770-a261-979b3f467bce\") " Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.378952 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs" (OuterVolumeSpecName: "kube-api-access-bxczs") pod "f4e2ba1e-bb1f-4770-a261-979b3f467bce" (UID: "f4e2ba1e-bb1f-4770-a261-979b3f467bce"). InnerVolumeSpecName "kube-api-access-bxczs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.382619 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts" (OuterVolumeSpecName: "scripts") pod "f4e2ba1e-bb1f-4770-a261-979b3f467bce" (UID: "f4e2ba1e-bb1f-4770-a261-979b3f467bce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.402509 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data" (OuterVolumeSpecName: "config-data") pod "f4e2ba1e-bb1f-4770-a261-979b3f467bce" (UID: "f4e2ba1e-bb1f-4770-a261-979b3f467bce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.413671 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4e2ba1e-bb1f-4770-a261-979b3f467bce" (UID: "f4e2ba1e-bb1f-4770-a261-979b3f467bce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.478070 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxczs\" (UniqueName: \"kubernetes.io/projected/f4e2ba1e-bb1f-4770-a261-979b3f467bce-kube-api-access-bxczs\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.478115 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.478126 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.478137 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e2ba1e-bb1f-4770-a261-979b3f467bce-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.902240 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-55974" event={"ID":"f4e2ba1e-bb1f-4770-a261-979b3f467bce","Type":"ContainerDied","Data":"ea691c7ed8e6a9193abbd12ff0c0bd4fa21833deca6be248255600315c7f9d24"} Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.902444 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea691c7ed8e6a9193abbd12ff0c0bd4fa21833deca6be248255600315c7f9d24" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.902479 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-55974" Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.905494 4658 generic.go:334] "Generic (PLEG): container finished" podID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerID="19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2" exitCode=0 Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.905532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerDied","Data":"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2"} Oct 02 11:38:48 crc kubenswrapper[4658]: I1002 11:38:48.905568 4658 scope.go:117] "RemoveContainer" containerID="6d45f089b45e50f886b377a7177e755f763adec478d0b95d9b7dd867cd3a61a8" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.002053 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:38:49 crc kubenswrapper[4658]: E1002 11:38:49.002554 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e2ba1e-bb1f-4770-a261-979b3f467bce" containerName="nova-cell0-conductor-db-sync" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.002579 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e2ba1e-bb1f-4770-a261-979b3f467bce" containerName="nova-cell0-conductor-db-sync" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.002822 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e2ba1e-bb1f-4770-a261-979b3f467bce" containerName="nova-cell0-conductor-db-sync" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.003705 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.009014 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.009204 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k4sqh" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.010731 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.090051 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcw6\" (UniqueName: \"kubernetes.io/projected/8441c161-18f6-46d9-a327-ac3857d077d2-kube-api-access-2kcw6\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.090141 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.090223 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 
11:38:49.194513 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.194633 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.194757 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcw6\" (UniqueName: \"kubernetes.io/projected/8441c161-18f6-46d9-a327-ac3857d077d2-kube-api-access-2kcw6\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.199974 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.201388 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441c161-18f6-46d9-a327-ac3857d077d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.212868 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcw6\" (UniqueName: \"kubernetes.io/projected/8441c161-18f6-46d9-a327-ac3857d077d2-kube-api-access-2kcw6\") pod \"nova-cell0-conductor-0\" (UID: \"8441c161-18f6-46d9-a327-ac3857d077d2\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.339552 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.544033 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.802036 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.923946 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerStarted","Data":"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f"} Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.924062 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:38:49 crc kubenswrapper[4658]: I1002 11:38:49.938997 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8441c161-18f6-46d9-a327-ac3857d077d2","Type":"ContainerStarted","Data":"8469dceec94529e13cd3104202bf44db5aaf54d52d5ed15cb67c84aaf7d31160"} Oct 02 11:38:50 crc kubenswrapper[4658]: I1002 11:38:50.026190 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.179991361 podStartE2EDuration="6.026170232s" podCreationTimestamp="2025-10-02 11:38:44 +0000 UTC" firstStartedPulling="2025-10-02 11:38:45.613035772 +0000 UTC m=+1206.504189349" lastFinishedPulling="2025-10-02 11:38:49.459214653 +0000 UTC m=+1210.350368220" observedRunningTime="2025-10-02 11:38:49.992630911 +0000 UTC m=+1210.883784498" watchObservedRunningTime="2025-10-02 11:38:50.026170232 +0000 UTC m=+1210.917323799" Oct 02 11:38:50 crc kubenswrapper[4658]: I1002 11:38:50.966319 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8441c161-18f6-46d9-a327-ac3857d077d2","Type":"ContainerStarted","Data":"7dc11e93928e8ac4c1362b6cce32d387a4d42a4584617bacdc1cf5546bf732b1"} Oct 02 11:38:50 crc kubenswrapper[4658]: I1002 11:38:50.966689 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:51 crc kubenswrapper[4658]: I1002 11:38:51.003812 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.003789407 podStartE2EDuration="3.003789407s" podCreationTimestamp="2025-10-02 11:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:38:50.980946944 +0000 UTC m=+1211.872100531" watchObservedRunningTime="2025-10-02 11:38:51.003789407 +0000 UTC m=+1211.894942974" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.375021 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.543648 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial 
tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.880198 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-skqqc"] Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.881477 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.883356 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.884668 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 11:38:59 crc kubenswrapper[4658]: I1002 11:38:59.899590 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-skqqc"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.011572 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.011676 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pbv\" (UniqueName: \"kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.011734 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.011793 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.113604 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.113681 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pbv\" (UniqueName: \"kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.113730 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.113777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.123611 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.124665 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.139930 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.161515 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pbv\" (UniqueName: \"kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv\") pod \"nova-cell0-cell-mapping-skqqc\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.202390 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.204148 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.205265 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.250356 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.252347 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.298341 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.311251 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.318728 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319340 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319411 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319478 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319501 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgsg\" (UniqueName: \"kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319533 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z689s\" (UniqueName: \"kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319604 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319677 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.319711 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.344179 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.346006 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.353916 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.375341 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.406381 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.407655 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421201 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421278 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421377 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421403 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421439 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vb2\" (UniqueName: \"kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421475 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421500 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgsg\" (UniqueName: \"kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421536 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z689s\" (UniqueName: 
\"kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421594 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdz9\" (UniqueName: \"kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421626 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421651 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421683 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421714 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.421751 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.425810 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.426274 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.426640 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.436409 4658 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.438462 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.444903 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.465616 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.468230 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.486460 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgsg\" (UniqueName: \"kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg\") pod \"nova-metadata-0\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.488464 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.491323 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z689s\" (UniqueName: \"kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s\") pod \"nova-api-0\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526269 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526347 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62vb2\" (UniqueName: \"kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526469 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdz9\" (UniqueName: \"kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526496 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle\") 
pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526535 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.526626 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.533099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.535188 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.535191 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.537190 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.553103 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vb2\" (UniqueName: \"kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2\") pod \"nova-scheduler-0\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.556871 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.559591 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdz9\" (UniqueName: \"kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.615381 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.617125 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.657846 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.740631 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.740614 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.741042 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.741087 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.741111 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.741166 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ddd5\" (UniqueName: \"kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.741250 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.781915 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.848885 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.856846 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ddd5\" (UniqueName: \"kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.856969 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.857131 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.857191 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.857228 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.857253 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.858106 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.858191 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.858741 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 
11:39:00.859262 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.859392 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.888045 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ddd5\" (UniqueName: \"kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5\") pod \"dnsmasq-dns-845d6d6f59-6xf49\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:00 crc kubenswrapper[4658]: I1002 11:39:00.969147 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.082391 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-skqqc"] Oct 02 11:39:01 crc kubenswrapper[4658]: W1002 11:39:01.126018 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86fefaf1_a889_4b79_b9bf_e53d04639c2e.slice/crio-9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c WatchSource:0}: Error finding container 9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c: Status 404 returned error can't find the container with id 9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c Oct 02 11:39:01 crc kubenswrapper[4658]: W1002 11:39:01.287780 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16004f96_9ecf_4206_adcc_c62ee45dca24.slice/crio-889597d972d7d5ee20a93e6c0a917ca9c32def0f0341a215c0c80a0f72148d03 WatchSource:0}: Error finding container 889597d972d7d5ee20a93e6c0a917ca9c32def0f0341a215c0c80a0f72148d03: Status 404 returned error can't find the container with id 889597d972d7d5ee20a93e6c0a917ca9c32def0f0341a215c0c80a0f72148d03 Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.313075 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.562214 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.678666 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9jhhj"] Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.680643 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.684381 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.689563 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.736420 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9jhhj"] Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.770147 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.790943 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.791001 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.791178 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.791214 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgkr\" (UniqueName: \"kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.816646 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:01 crc kubenswrapper[4658]: W1002 11:39:01.820534 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4099dbad_5133_4954_9bf5_1131c1d0164a.slice/crio-186b434da1fd392228694b359b4b121fef9e4f6f38ee395d5bac0e6169a98d3e WatchSource:0}: Error finding container 186b434da1fd392228694b359b4b121fef9e4f6f38ee395d5bac0e6169a98d3e: Status 404 returned error can't find the container with id 186b434da1fd392228694b359b4b121fef9e4f6f38ee395d5bac0e6169a98d3e Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.832489 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:01 crc kubenswrapper[4658]: W1002 11:39:01.836496 4658 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ba6acd_c67c_4f97_aea8_0121cb4bd4a2.slice/crio-97cf79ef180cbc62fc353da2b7aa25b7fee3b1f1a867117b5a6b729e608e5ae9 WatchSource:0}: Error finding container 97cf79ef180cbc62fc353da2b7aa25b7fee3b1f1a867117b5a6b729e608e5ae9: Status 404 returned error can't find the container with id 97cf79ef180cbc62fc353da2b7aa25b7fee3b1f1a867117b5a6b729e608e5ae9 Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.892530 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.892577 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgkr\" (UniqueName: \"kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.892710 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.892767 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.906451 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.906638 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.909743 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:01 crc kubenswrapper[4658]: I1002 11:39:01.928018 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgkr\" (UniqueName: \"kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr\") pod \"nova-cell1-conductor-db-sync-9jhhj\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " 
pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.028135 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.137447 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-skqqc" event={"ID":"86fefaf1-a889-4b79-b9bf-e53d04639c2e","Type":"ContainerStarted","Data":"091b5f3ac43ac2ce0f6435c8dcd12c903bae95419f321d90aef8537cfb4b423e"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.137506 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-skqqc" event={"ID":"86fefaf1-a889-4b79-b9bf-e53d04639c2e","Type":"ContainerStarted","Data":"9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.139388 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerStarted","Data":"4a98143d8d4019ec3eebd6324c3de13b88ac48b46d75b6fefaa09b095de9388c"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.142051 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerStarted","Data":"579cb0c19595e672d8ba89c592844444a1d971187aebbcfa5ca53148fc23629a"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.144617 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4099dbad-5133-4954-9bf5-1131c1d0164a","Type":"ContainerStarted","Data":"186b434da1fd392228694b359b4b121fef9e4f6f38ee395d5bac0e6169a98d3e"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.146240 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"16004f96-9ecf-4206-adcc-c62ee45dca24","Type":"ContainerStarted","Data":"889597d972d7d5ee20a93e6c0a917ca9c32def0f0341a215c0c80a0f72148d03"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.167831 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerStarted","Data":"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.167878 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerStarted","Data":"97cf79ef180cbc62fc353da2b7aa25b7fee3b1f1a867117b5a6b729e608e5ae9"} Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.169604 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-skqqc" podStartSLOduration=3.169590216 podStartE2EDuration="3.169590216s" podCreationTimestamp="2025-10-02 11:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:02.152054736 +0000 UTC m=+1223.043208313" watchObservedRunningTime="2025-10-02 11:39:02.169590216 +0000 UTC m=+1223.060743783" Oct 02 11:39:02 crc kubenswrapper[4658]: I1002 11:39:02.425182 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9jhhj"] Oct 02 11:39:02 crc kubenswrapper[4658]: W1002 11:39:02.444007 4658 manager.go:1169] Failed to 
Oct 02 11:39:02 crc kubenswrapper[4658]: W1002 11:39:02.444007 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a2866b_59b0_47dc_b036_cb6f5c08bd40.slice/crio-91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e WatchSource:0}: Error finding container 91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e: Status 404 returned error can't find the container with id 91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.190026 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" event={"ID":"58a2866b-59b0-47dc-b036-cb6f5c08bd40","Type":"ContainerStarted","Data":"5c5ec061387b5c2f557bb6fafd067a897cbb290cd247047bc8df3d28fc67117a"}
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.190745 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" event={"ID":"58a2866b-59b0-47dc-b036-cb6f5c08bd40","Type":"ContainerStarted","Data":"91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e"}
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.197970 4658 generic.go:334] "Generic (PLEG): container finished" podID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerID="6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0" exitCode=0
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.198869 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerDied","Data":"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0"}
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.198902 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerStarted","Data":"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f"}
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.199165 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49"
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.235906 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" podStartSLOduration=2.235884823 podStartE2EDuration="2.235884823s" podCreationTimestamp="2025-10-02 11:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:03.218017483 +0000 UTC m=+1224.109171050" watchObservedRunningTime="2025-10-02 11:39:03.235884823 +0000 UTC m=+1224.127038380"
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.267314 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" podStartSLOduration=3.2672793540000002 podStartE2EDuration="3.267279354s" podCreationTimestamp="2025-10-02 11:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:03.250726436 +0000 UTC m=+1224.141880003" watchObservedRunningTime="2025-10-02 11:39:03.267279354 +0000 UTC m=+1224.158432921"
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.867411 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 11:39:03 crc kubenswrapper[4658]: I1002 11:39:03.875804 4658 kubelet.go:2437] "SyncLoop DELETE"
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.239346 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"16004f96-9ecf-4206-adcc-c62ee45dca24","Type":"ContainerStarted","Data":"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.241646 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerStarted","Data":"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.241693 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerStarted","Data":"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.241756 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-metadata" containerID="cri-o://4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1" gracePeriod=30 Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.241793 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-log" containerID="cri-o://bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394" gracePeriod=30 Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.245249 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerStarted","Data":"ab6d168301b817650aa67973968b696bc39642aff80bdad71f1f7dc2533d13d7"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.245348 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerStarted","Data":"f7548f54c3767b0055a3b5321b62868f5624928bd8654d1e5a9409d5abce8619"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.248757 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4099dbad-5133-4954-9bf5-1131c1d0164a","Type":"ContainerStarted","Data":"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6"} Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.248896 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4099dbad-5133-4954-9bf5-1131c1d0164a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6" gracePeriod=30 Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.265336 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.049761381 podStartE2EDuration="6.265318728s" podCreationTimestamp="2025-10-02 11:39:00 +0000 UTC" firstStartedPulling="2025-10-02 11:39:01.29133737 +0000 UTC m=+1222.182490947" lastFinishedPulling="2025-10-02 11:39:05.506894727 +0000 UTC m=+1226.398048294" observedRunningTime="2025-10-02 11:39:06.254489946 +0000 UTC m=+1227.145643523" watchObservedRunningTime="2025-10-02 11:39:06.265318728 +0000 UTC 
m=+1227.156472295" Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.286512 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.329684581 podStartE2EDuration="6.286492997s" podCreationTimestamp="2025-10-02 11:39:00 +0000 UTC" firstStartedPulling="2025-10-02 11:39:01.555461766 +0000 UTC m=+1222.446615333" lastFinishedPulling="2025-10-02 11:39:05.512270192 +0000 UTC m=+1226.403423749" observedRunningTime="2025-10-02 11:39:06.273488934 +0000 UTC m=+1227.164642501" watchObservedRunningTime="2025-10-02 11:39:06.286492997 +0000 UTC m=+1227.177646564" Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.301219 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.565662229 podStartE2EDuration="6.301201344s" podCreationTimestamp="2025-10-02 11:39:00 +0000 UTC" firstStartedPulling="2025-10-02 11:39:01.771530098 +0000 UTC m=+1222.662683665" lastFinishedPulling="2025-10-02 11:39:05.507069213 +0000 UTC m=+1226.398222780" observedRunningTime="2025-10-02 11:39:06.294535638 +0000 UTC m=+1227.185689205" watchObservedRunningTime="2025-10-02 11:39:06.301201344 +0000 UTC m=+1227.192354911" Oct 02 11:39:06 crc kubenswrapper[4658]: I1002 11:39:06.317013 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.633843735 podStartE2EDuration="6.316993428s" podCreationTimestamp="2025-10-02 11:39:00 +0000 UTC" firstStartedPulling="2025-10-02 11:39:01.825097619 +0000 UTC m=+1222.716251186" lastFinishedPulling="2025-10-02 11:39:05.508247312 +0000 UTC m=+1226.399400879" observedRunningTime="2025-10-02 11:39:06.313168604 +0000 UTC m=+1227.204322191" watchObservedRunningTime="2025-10-02 11:39:06.316993428 +0000 UTC m=+1227.208147005" Oct 02 11:39:07 crc kubenswrapper[4658]: I1002 11:39:07.260350 4658 generic.go:334] "Generic (PLEG): container finished" podID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerID="bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394" exitCode=143 Oct 02 11:39:07 crc kubenswrapper[4658]: I1002 11:39:07.260520 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerDied","Data":"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394"} Oct 02 11:39:09 crc kubenswrapper[4658]: I1002 11:39:09.543919 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6dbf7b8b8b-kj6xr" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.558579 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.559213 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.596077 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.741681 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.741963 4658 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.783416 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.783727 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.849850 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:10 crc kubenswrapper[4658]: I1002 11:39:10.971559 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.092024 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.092932 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="dnsmasq-dns" containerID="cri-o://95cb18122631cb565b73c262d44f302c384ed175ad61263ef04eb1ab2006875c" gracePeriod=10 Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.307881 4658 generic.go:334] "Generic (PLEG): container finished" podID="86fefaf1-a889-4b79-b9bf-e53d04639c2e" containerID="091b5f3ac43ac2ce0f6435c8dcd12c903bae95419f321d90aef8537cfb4b423e" exitCode=0 Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.307958 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-skqqc" event={"ID":"86fefaf1-a889-4b79-b9bf-e53d04639c2e","Type":"ContainerDied","Data":"091b5f3ac43ac2ce0f6435c8dcd12c903bae95419f321d90aef8537cfb4b423e"} Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.321113 4658 generic.go:334] "Generic (PLEG): container finished" podID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerID="95cb18122631cb565b73c262d44f302c384ed175ad61263ef04eb1ab2006875c" exitCode=0 Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.321362 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" event={"ID":"ff74bfb7-1171-47ce-acb3-df2b35d0ca20","Type":"ContainerDied","Data":"95cb18122631cb565b73c262d44f302c384ed175ad61263ef04eb1ab2006875c"} Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.369237 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.770926 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816323 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816687 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816744 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816764 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816799 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b5bn\" (UniqueName: \"kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.816825 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0\") pod \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\" (UID: \"ff74bfb7-1171-47ce-acb3-df2b35d0ca20\") " Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.832399 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.832474 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.846881 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn" (OuterVolumeSpecName: "kube-api-access-8b5bn") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "kube-api-access-8b5bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.905986 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.912666 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config" (OuterVolumeSpecName: "config") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.918583 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.920008 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.920051 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.920066 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.920078 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b5bn\" (UniqueName: \"kubernetes.io/projected/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-kube-api-access-8b5bn\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.930896 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:11 crc kubenswrapper[4658]: I1002 11:39:11.983772 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff74bfb7-1171-47ce-acb3-df2b35d0ca20" (UID: "ff74bfb7-1171-47ce-acb3-df2b35d0ca20"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.024934 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.024975 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff74bfb7-1171-47ce-acb3-df2b35d0ca20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.334254 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.334745 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-kw9hk" event={"ID":"ff74bfb7-1171-47ce-acb3-df2b35d0ca20","Type":"ContainerDied","Data":"5023168be63d57ee4afc23a614d4bc6d54ca9f19ccf0cc1d153982351d00694b"} Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.334795 4658 scope.go:117] "RemoveContainer" containerID="95cb18122631cb565b73c262d44f302c384ed175ad61263ef04eb1ab2006875c" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.372181 4658 scope.go:117] "RemoveContainer" containerID="4755c097e2730a1e59f62856b8223dd41a73976014ca7b625ffb481dbf72a05e" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.382185 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.392733 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-kw9hk"] Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.785200 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.961941 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data\") pod \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.962323 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts\") pod \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.962553 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pbv\" (UniqueName: \"kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv\") pod \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.962613 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle\") pod \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\" (UID: \"86fefaf1-a889-4b79-b9bf-e53d04639c2e\") " Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.967138 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv" (OuterVolumeSpecName: "kube-api-access-h6pbv") pod "86fefaf1-a889-4b79-b9bf-e53d04639c2e" (UID: "86fefaf1-a889-4b79-b9bf-e53d04639c2e"). InnerVolumeSpecName "kube-api-access-h6pbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:12 crc kubenswrapper[4658]: I1002 11:39:12.967737 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts" (OuterVolumeSpecName: "scripts") pod "86fefaf1-a889-4b79-b9bf-e53d04639c2e" (UID: "86fefaf1-a889-4b79-b9bf-e53d04639c2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.004446 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data" (OuterVolumeSpecName: "config-data") pod "86fefaf1-a889-4b79-b9bf-e53d04639c2e" (UID: "86fefaf1-a889-4b79-b9bf-e53d04639c2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.008077 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86fefaf1-a889-4b79-b9bf-e53d04639c2e" (UID: "86fefaf1-a889-4b79-b9bf-e53d04639c2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.066264 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.066322 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.066337 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fefaf1-a889-4b79-b9bf-e53d04639c2e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.066350 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pbv\" (UniqueName: \"kubernetes.io/projected/86fefaf1-a889-4b79-b9bf-e53d04639c2e-kube-api-access-h6pbv\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.349176 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-skqqc" event={"ID":"86fefaf1-a889-4b79-b9bf-e53d04639c2e","Type":"ContainerDied","Data":"9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c"} Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.349228 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8543770db1179dacd387732ed989c8198bceeea2b851dc11b309b5110bf30c" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.349323 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-skqqc" Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.516012 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.516518 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-api" containerID="cri-o://ab6d168301b817650aa67973968b696bc39642aff80bdad71f1f7dc2533d13d7" gracePeriod=30 Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.516390 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-log" containerID="cri-o://f7548f54c3767b0055a3b5321b62868f5624928bd8654d1e5a9409d5abce8619" gracePeriod=30 Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.532912 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.533461 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerName="nova-scheduler-scheduler" containerID="cri-o://9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" gracePeriod=30 Oct 02 11:39:13 crc kubenswrapper[4658]: I1002 11:39:13.962018 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" path="/var/lib/kubelet/pods/ff74bfb7-1171-47ce-acb3-df2b35d0ca20/volumes" Oct 02 11:39:14 crc kubenswrapper[4658]: I1002 11:39:14.367558 4658 generic.go:334] "Generic (PLEG): container finished" 
podID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerID="f7548f54c3767b0055a3b5321b62868f5624928bd8654d1e5a9409d5abce8619" exitCode=143 Oct 02 11:39:14 crc kubenswrapper[4658]: I1002 11:39:14.367606 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerDied","Data":"f7548f54c3767b0055a3b5321b62868f5624928bd8654d1e5a9409d5abce8619"} Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.186046 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.306824 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.306886 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.306926 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.306957 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.307021 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.307044 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrk2z\" (UniqueName: \"kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.307142 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs\") pod \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\" (UID: \"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2\") " Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.307831 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs" (OuterVolumeSpecName: "logs") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.318862 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z" (OuterVolumeSpecName: "kube-api-access-zrk2z") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "kube-api-access-zrk2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.338064 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.343874 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts" (OuterVolumeSpecName: "scripts") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.348962 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data" (OuterVolumeSpecName: "config-data") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.363108 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.385607 4658 generic.go:334] "Generic (PLEG): container finished" podID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerID="aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd" exitCode=137 Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.385667 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerDied","Data":"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd"} Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.385695 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dbf7b8b8b-kj6xr" event={"ID":"7679dd1e-82a5-47eb-83f3-08a1e0cab3c2","Type":"ContainerDied","Data":"4742d9384407970bd1c5bec11bf52283c83e90ec7fee62e6559d9b338d3bd304"} Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.385711 4658 scope.go:117] "RemoveContainer" containerID="19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.385887 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dbf7b8b8b-kj6xr" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.394469 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" (UID: "7679dd1e-82a5-47eb-83f3-08a1e0cab3c2"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410711 4658 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410760 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410772 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410783 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410797 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410808 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrk2z\" (UniqueName: \"kubernetes.io/projected/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-kube-api-access-zrk2z\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.410820 4658 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.440269 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.559352 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.564771 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.594140 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.594203 4658 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerName="nova-scheduler-scheduler" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.637359 4658 scope.go:117] "RemoveContainer" containerID="aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.664073 4658 scope.go:117] "RemoveContainer" containerID="19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2" Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.664494 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2\": container with ID starting with 19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2 not found: ID does not exist" containerID="19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.664539 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2"} err="failed to get container status \"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2\": rpc error: code = NotFound desc = could not find container \"19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2\": container with ID starting with 19f9093cf2e92c8048ee39a88f32f6447081020eb56f44604c805236630f92e2 not found: ID does not exist" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.664567 4658 scope.go:117] "RemoveContainer" containerID="aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd" Oct 02 11:39:15 crc kubenswrapper[4658]: E1002 11:39:15.664996 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd\": container with ID starting with aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd not found: ID does not exist" containerID="aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.665034 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd"} err="failed to get container status \"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd\": rpc error: code = NotFound desc = could not find container \"aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd\": container with ID starting with aa4933124f53907d5e4e4511f83426ba8e1fbcbdda988dd1976cf71c7b8a2fdd not found: ID does not exist" Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.720360 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:39:15 crc kubenswrapper[4658]: I1002 11:39:15.735847 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6dbf7b8b8b-kj6xr"] Oct 02 11:39:15 crc 
kubenswrapper[4658]: I1002 11:39:15.960751 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" path="/var/lib/kubelet/pods/7679dd1e-82a5-47eb-83f3-08a1e0cab3c2/volumes" Oct 02 11:39:16 crc kubenswrapper[4658]: I1002 11:39:16.400703 4658 generic.go:334] "Generic (PLEG): container finished" podID="58a2866b-59b0-47dc-b036-cb6f5c08bd40" containerID="5c5ec061387b5c2f557bb6fafd067a897cbb290cd247047bc8df3d28fc67117a" exitCode=0 Oct 02 11:39:16 crc kubenswrapper[4658]: I1002 11:39:16.400800 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" event={"ID":"58a2866b-59b0-47dc-b036-cb6f5c08bd40","Type":"ContainerDied","Data":"5c5ec061387b5c2f557bb6fafd067a897cbb290cd247047bc8df3d28fc67117a"} Oct 02 11:39:17 crc kubenswrapper[4658]: I1002 11:39:17.920959 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:17 crc kubenswrapper[4658]: I1002 11:39:17.928452 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.072138 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lgkr\" (UniqueName: \"kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr\") pod \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.072245 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle\") pod \"16004f96-9ecf-4206-adcc-c62ee45dca24\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.072311 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62vb2\" (UniqueName: \"kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2\") pod \"16004f96-9ecf-4206-adcc-c62ee45dca24\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.073216 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle\") pod \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.073268 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data\") pod \"16004f96-9ecf-4206-adcc-c62ee45dca24\" (UID: \"16004f96-9ecf-4206-adcc-c62ee45dca24\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.073328 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts\") pod \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.073387 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data\") pod \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\" (UID: \"58a2866b-59b0-47dc-b036-cb6f5c08bd40\") " Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.084567 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts" (OuterVolumeSpecName: "scripts") pod "58a2866b-59b0-47dc-b036-cb6f5c08bd40" (UID: "58a2866b-59b0-47dc-b036-cb6f5c08bd40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.084625 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr" (OuterVolumeSpecName: "kube-api-access-7lgkr") pod "58a2866b-59b0-47dc-b036-cb6f5c08bd40" (UID: "58a2866b-59b0-47dc-b036-cb6f5c08bd40"). InnerVolumeSpecName "kube-api-access-7lgkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.084653 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2" (OuterVolumeSpecName: "kube-api-access-62vb2") pod "16004f96-9ecf-4206-adcc-c62ee45dca24" (UID: "16004f96-9ecf-4206-adcc-c62ee45dca24"). InnerVolumeSpecName "kube-api-access-62vb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.109386 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data" (OuterVolumeSpecName: "config-data") pod "58a2866b-59b0-47dc-b036-cb6f5c08bd40" (UID: "58a2866b-59b0-47dc-b036-cb6f5c08bd40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.113507 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16004f96-9ecf-4206-adcc-c62ee45dca24" (UID: "16004f96-9ecf-4206-adcc-c62ee45dca24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.118437 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58a2866b-59b0-47dc-b036-cb6f5c08bd40" (UID: "58a2866b-59b0-47dc-b036-cb6f5c08bd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.119991 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data" (OuterVolumeSpecName: "config-data") pod "16004f96-9ecf-4206-adcc-c62ee45dca24" (UID: "16004f96-9ecf-4206-adcc-c62ee45dca24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176042 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lgkr\" (UniqueName: \"kubernetes.io/projected/58a2866b-59b0-47dc-b036-cb6f5c08bd40-kube-api-access-7lgkr\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176076 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176086 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62vb2\" (UniqueName: \"kubernetes.io/projected/16004f96-9ecf-4206-adcc-c62ee45dca24-kube-api-access-62vb2\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176095 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176104 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16004f96-9ecf-4206-adcc-c62ee45dca24-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176113 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.176128 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a2866b-59b0-47dc-b036-cb6f5c08bd40-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.422771 4658 generic.go:334] "Generic (PLEG): container finished" podID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" exitCode=0 Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.422840 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"16004f96-9ecf-4206-adcc-c62ee45dca24","Type":"ContainerDied","Data":"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298"} Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.422871 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"16004f96-9ecf-4206-adcc-c62ee45dca24","Type":"ContainerDied","Data":"889597d972d7d5ee20a93e6c0a917ca9c32def0f0341a215c0c80a0f72148d03"} Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.422889 4658 scope.go:117] "RemoveContainer" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.422995 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.428098 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" event={"ID":"58a2866b-59b0-47dc-b036-cb6f5c08bd40","Type":"ContainerDied","Data":"91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e"} Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.428131 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9jhhj" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.428140 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a185f83670ede57daaff63255abeebba97c8783f166275dde3a9475f41a17e" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.456826 4658 scope.go:117] "RemoveContainer" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.457330 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298\": container with ID starting with 9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298 not found: ID does not exist" containerID="9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.457369 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298"} err="failed to get container status \"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298\": rpc error: code = NotFound desc = could not find container \"9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298\": container with ID starting with 9bc1808a90f3328053cb94c0f8a721da721ce4f20a99dab21d55792c6398a298 not found: ID does not exist" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.564908 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.578012 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.592046 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593077 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon-log" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593106 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon-log" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593121 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593129 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593143 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="dnsmasq-dns" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593151 4658 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="dnsmasq-dns" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593173 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fefaf1-a889-4b79-b9bf-e53d04639c2e" containerName="nova-manage" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593181 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fefaf1-a889-4b79-b9bf-e53d04639c2e" containerName="nova-manage" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593197 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerName="nova-scheduler-scheduler" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593205 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerName="nova-scheduler-scheduler" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593231 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a2866b-59b0-47dc-b036-cb6f5c08bd40" containerName="nova-cell1-conductor-db-sync" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593239 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a2866b-59b0-47dc-b036-cb6f5c08bd40" containerName="nova-cell1-conductor-db-sync" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593258 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593265 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: E1002 11:39:18.593280 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="init" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593287 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="init" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593780 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593800 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff74bfb7-1171-47ce-acb3-df2b35d0ca20" containerName="dnsmasq-dns" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593815 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fefaf1-a889-4b79-b9bf-e53d04639c2e" containerName="nova-manage" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593829 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a2866b-59b0-47dc-b036-cb6f5c08bd40" containerName="nova-cell1-conductor-db-sync" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593860 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" containerName="nova-scheduler-scheduler" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.593874 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon-log" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.594920 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.597659 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.602420 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.621918 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.622866 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7679dd1e-82a5-47eb-83f3-08a1e0cab3c2" containerName="horizon" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.624144 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.628648 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.637147 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.688387 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.688634 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.688838 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.790614 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj28\" (UniqueName: \"kubernetes.io/projected/d623c2ea-e4e8-4031-af93-35f76f08dba2-kube-api-access-8pj28\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.791036 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.791107 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " 
pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.791192 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.791311 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.791386 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.796109 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.796157 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.810062 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm\") pod \"nova-scheduler-0\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.893237 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.893325 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj28\" (UniqueName: \"kubernetes.io/projected/d623c2ea-e4e8-4031-af93-35f76f08dba2-kube-api-access-8pj28\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.893379 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.897715 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.898702 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623c2ea-e4e8-4031-af93-35f76f08dba2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.914318 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj28\" (UniqueName: \"kubernetes.io/projected/d623c2ea-e4e8-4031-af93-35f76f08dba2-kube-api-access-8pj28\") pod \"nova-cell1-conductor-0\" (UID: \"d623c2ea-e4e8-4031-af93-35f76f08dba2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.917115 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:39:18 crc kubenswrapper[4658]: I1002 11:39:18.947152 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.506487 4658 generic.go:334] "Generic (PLEG): container finished" podID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerID="ab6d168301b817650aa67973968b696bc39642aff80bdad71f1f7dc2533d13d7" exitCode=0 Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.506898 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerDied","Data":"ab6d168301b817650aa67973968b696bc39642aff80bdad71f1f7dc2533d13d7"} Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.541209 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.541427 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3d138ce0-7164-4e2f-9690-83719e55b301" containerName="kube-state-metrics" containerID="cri-o://2a485927051fda1e74c45cc634305f6bea335369dbb3237494fd65af75528a2b" gracePeriod=30 Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.652386 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.765145 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.778230 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.938598 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z689s\" (UniqueName: \"kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s\") pod \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.938988 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data\") pod \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.939088 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs\") pod \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.939147 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle\") pod \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\" (UID: \"2f053300-30e6-48b6-b474-9e6cba4dbeb4\") " Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.940745 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs" (OuterVolumeSpecName: "logs") pod "2f053300-30e6-48b6-b474-9e6cba4dbeb4" (UID: "2f053300-30e6-48b6-b474-9e6cba4dbeb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.947910 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s" (OuterVolumeSpecName: "kube-api-access-z689s") pod "2f053300-30e6-48b6-b474-9e6cba4dbeb4" (UID: "2f053300-30e6-48b6-b474-9e6cba4dbeb4"). InnerVolumeSpecName "kube-api-access-z689s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:19 crc kubenswrapper[4658]: I1002 11:39:19.963315 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16004f96-9ecf-4206-adcc-c62ee45dca24" path="/var/lib/kubelet/pods/16004f96-9ecf-4206-adcc-c62ee45dca24/volumes" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.023229 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f053300-30e6-48b6-b474-9e6cba4dbeb4" (UID: "2f053300-30e6-48b6-b474-9e6cba4dbeb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.037612 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data" (OuterVolumeSpecName: "config-data") pod "2f053300-30e6-48b6-b474-9e6cba4dbeb4" (UID: "2f053300-30e6-48b6-b474-9e6cba4dbeb4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.040903 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f053300-30e6-48b6-b474-9e6cba4dbeb4-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.040944 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.040959 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z689s\" (UniqueName: \"kubernetes.io/projected/2f053300-30e6-48b6-b474-9e6cba4dbeb4-kube-api-access-z689s\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.040973 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f053300-30e6-48b6-b474-9e6cba4dbeb4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.537316 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d623c2ea-e4e8-4031-af93-35f76f08dba2","Type":"ContainerStarted","Data":"a0d555cefdd84a1c505ce599b9fefe07d76cbcd256a50e0e2b9c56864a30fa41"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.537861 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d623c2ea-e4e8-4031-af93-35f76f08dba2","Type":"ContainerStarted","Data":"e5474a31e7877827afb3e835cc0e1ddcada2375ac600781c0d1ca5da467d3957"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.537892 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.539913 4658 generic.go:334] "Generic (PLEG): container finished" podID="3d138ce0-7164-4e2f-9690-83719e55b301" containerID="2a485927051fda1e74c45cc634305f6bea335369dbb3237494fd65af75528a2b" exitCode=2 Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.539957 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d138ce0-7164-4e2f-9690-83719e55b301","Type":"ContainerDied","Data":"2a485927051fda1e74c45cc634305f6bea335369dbb3237494fd65af75528a2b"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.541743 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f053300-30e6-48b6-b474-9e6cba4dbeb4","Type":"ContainerDied","Data":"579cb0c19595e672d8ba89c592844444a1d971187aebbcfa5ca53148fc23629a"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.541814 4658 scope.go:117] "RemoveContainer" containerID="ab6d168301b817650aa67973968b696bc39642aff80bdad71f1f7dc2533d13d7" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.541951 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.577441 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9dede2bc-0f08-4ce1-8977-c5427f0ad52f","Type":"ContainerStarted","Data":"82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.577507 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9dede2bc-0f08-4ce1-8977-c5427f0ad52f","Type":"ContainerStarted","Data":"be3fca7353312b5d874d06cdc15b13538ca04ef07250aa9a0a823de4c33c8af2"} Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.601107 4658 scope.go:117] "RemoveContainer" containerID="f7548f54c3767b0055a3b5321b62868f5624928bd8654d1e5a9409d5abce8619" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.673771 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.6737493519999997 podStartE2EDuration="2.673749352s" podCreationTimestamp="2025-10-02 11:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:20.583000952 +0000 UTC m=+1241.474154519" watchObservedRunningTime="2025-10-02 11:39:20.673749352 +0000 UTC m=+1241.564902929" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.719411 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.737662 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.753766 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:20 crc kubenswrapper[4658]: E1002 11:39:20.763462 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-log" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.763505 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-log" Oct 02 11:39:20 crc kubenswrapper[4658]: E1002 11:39:20.763518 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-api" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.764230 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-api" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.765818 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.765805054 podStartE2EDuration="2.765805054s" podCreationTimestamp="2025-10-02 11:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:20.662862937 +0000 UTC m=+1241.554016504" watchObservedRunningTime="2025-10-02 11:39:20.765805054 +0000 UTC m=+1241.656958621" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.766003 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" containerName="nova-api-log" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.766045 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" 
containerName="nova-api-api" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.767497 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.772258 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.781978 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.820111 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.964812 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rq8s\" (UniqueName: \"kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s\") pod \"3d138ce0-7164-4e2f-9690-83719e55b301\" (UID: \"3d138ce0-7164-4e2f-9690-83719e55b301\") " Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.967018 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.967287 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2cqv\" (UniqueName: \"kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.967430 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.967575 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:20 crc kubenswrapper[4658]: I1002 11:39:20.979136 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s" (OuterVolumeSpecName: "kube-api-access-5rq8s") pod "3d138ce0-7164-4e2f-9690-83719e55b301" (UID: "3d138ce0-7164-4e2f-9690-83719e55b301"). InnerVolumeSpecName "kube-api-access-5rq8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.069479 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.069895 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2cqv\" (UniqueName: \"kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.069929 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.069978 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.070084 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rq8s\" (UniqueName: \"kubernetes.io/projected/3d138ce0-7164-4e2f-9690-83719e55b301-kube-api-access-5rq8s\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.070928 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.074667 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.076609 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.091766 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2cqv\" (UniqueName: \"kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv\") pod \"nova-api-0\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.133161 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.589060 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.596777 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d138ce0-7164-4e2f-9690-83719e55b301","Type":"ContainerDied","Data":"ffbd2a528cc1d177d7f9a2f7321ba8730afd162b4bf0dea4dfde069809ba4491"} Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.596956 4658 scope.go:117] "RemoveContainer" containerID="2a485927051fda1e74c45cc634305f6bea335369dbb3237494fd65af75528a2b" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.609239 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.694983 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.711848 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.722855 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: E1002 11:39:21.723416 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d138ce0-7164-4e2f-9690-83719e55b301" containerName="kube-state-metrics" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.723444 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d138ce0-7164-4e2f-9690-83719e55b301" containerName="kube-state-metrics" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.723679 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d138ce0-7164-4e2f-9690-83719e55b301" containerName="kube-state-metrics" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.724548 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.726716 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.727117 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.736134 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.892024 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.892435 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.892522 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.892811 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-277jx\" (UniqueName: \"kubernetes.io/projected/f67801c0-f438-43ae-a45b-c2870b64f553-kube-api-access-277jx\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.969252 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f053300-30e6-48b6-b474-9e6cba4dbeb4" path="/var/lib/kubelet/pods/2f053300-30e6-48b6-b474-9e6cba4dbeb4/volumes" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.970005 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d138ce0-7164-4e2f-9690-83719e55b301" path="/var/lib/kubelet/pods/3d138ce0-7164-4e2f-9690-83719e55b301/volumes" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.990570 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.991177 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-central-agent" containerID="cri-o://3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af" gracePeriod=30 Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.991676 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="proxy-httpd" containerID="cri-o://c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f" gracePeriod=30 Oct 
02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.991743 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="sg-core" containerID="cri-o://52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9" gracePeriod=30 Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.991796 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-notification-agent" containerID="cri-o://335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a" gracePeriod=30 Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.995743 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.995851 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.995905 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-277jx\" (UniqueName: \"kubernetes.io/projected/f67801c0-f438-43ae-a45b-c2870b64f553-kube-api-access-277jx\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:21 crc kubenswrapper[4658]: I1002 11:39:21.995958 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.002836 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.007912 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.025668 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67801c0-f438-43ae-a45b-c2870b64f553-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.045091 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-277jx\" (UniqueName: \"kubernetes.io/projected/f67801c0-f438-43ae-a45b-c2870b64f553-kube-api-access-277jx\") pod \"kube-state-metrics-0\" (UID: \"f67801c0-f438-43ae-a45b-c2870b64f553\") " pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.090617 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.600362 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerStarted","Data":"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d"} Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.601010 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerStarted","Data":"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986"} Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.601029 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerStarted","Data":"15cd65593cc8f5c4659c5ff07769c93fd67985898f7df364b30f36fa7c69a7fa"} Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.607646 4658 generic.go:334] "Generic (PLEG): container finished" podID="481baa7a-7d97-44dd-b038-47a2969e3124" containerID="c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f" exitCode=0 Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.607682 4658 generic.go:334] "Generic (PLEG): container finished" podID="481baa7a-7d97-44dd-b038-47a2969e3124" containerID="52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9" exitCode=2 Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.607705 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerDied","Data":"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f"} Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.607753 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerDied","Data":"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9"} Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.614213 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:39:22 crc kubenswrapper[4658]: W1002 11:39:22.629281 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67801c0_f438_43ae_a45b_c2870b64f553.slice/crio-f51bb842a4ef1bde2152518f26f69d4703a67326e655624493924b85248084f4 WatchSource:0}: Error finding container f51bb842a4ef1bde2152518f26f69d4703a67326e655624493924b85248084f4: Status 404 returned error can't find the container with id f51bb842a4ef1bde2152518f26f69d4703a67326e655624493924b85248084f4 Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.632773 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:39:22 crc kubenswrapper[4658]: I1002 11:39:22.633230 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.63321124 podStartE2EDuration="2.63321124s" podCreationTimestamp="2025-10-02 11:39:20 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:22.622277995 +0000 UTC m=+1243.513431582" watchObservedRunningTime="2025-10-02 11:39:22.63321124 +0000 UTC m=+1243.524364807" Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.618564 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f67801c0-f438-43ae-a45b-c2870b64f553","Type":"ContainerStarted","Data":"a418ffe2516378729466930857a25f7c799dcc62466cf6c392442b36ddac4270"} Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.619397 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f67801c0-f438-43ae-a45b-c2870b64f553","Type":"ContainerStarted","Data":"f51bb842a4ef1bde2152518f26f69d4703a67326e655624493924b85248084f4"} Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.620237 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.624093 4658 generic.go:334] "Generic (PLEG): container finished" podID="481baa7a-7d97-44dd-b038-47a2969e3124" containerID="3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af" exitCode=0 Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.624138 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerDied","Data":"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af"} Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.645886 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.248809328 podStartE2EDuration="2.645861643s" podCreationTimestamp="2025-10-02 11:39:21 +0000 UTC" firstStartedPulling="2025-10-02 11:39:22.632384143 +0000 UTC m=+1243.523537710" lastFinishedPulling="2025-10-02 11:39:23.029436458 +0000 UTC m=+1243.920590025" observedRunningTime="2025-10-02 11:39:23.63992596 +0000 UTC m=+1244.531079537" watchObservedRunningTime="2025-10-02 11:39:23.645861643 +0000 UTC m=+1244.537015210" Oct 02 11:39:23 crc kubenswrapper[4658]: I1002 11:39:23.917769 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.070713 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.249908 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.249982 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.250096 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.250127 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wnqc\" (UniqueName: \"kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.250160 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.250254 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.250324 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd\") pod \"481baa7a-7d97-44dd-b038-47a2969e3124\" (UID: \"481baa7a-7d97-44dd-b038-47a2969e3124\") " Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.251096 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.251396 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.258364 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc" (OuterVolumeSpecName: "kube-api-access-2wnqc") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "kube-api-access-2wnqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.258392 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts" (OuterVolumeSpecName: "scripts") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.285691 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.336628 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353059 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353121 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wnqc\" (UniqueName: \"kubernetes.io/projected/481baa7a-7d97-44dd-b038-47a2969e3124-kube-api-access-2wnqc\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353133 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353142 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353168 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.353176 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481baa7a-7d97-44dd-b038-47a2969e3124-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.360862 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data" (OuterVolumeSpecName: "config-data") pod "481baa7a-7d97-44dd-b038-47a2969e3124" (UID: "481baa7a-7d97-44dd-b038-47a2969e3124"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.454674 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481baa7a-7d97-44dd-b038-47a2969e3124-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.637182 4658 generic.go:334] "Generic (PLEG): container finished" podID="481baa7a-7d97-44dd-b038-47a2969e3124" containerID="335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a" exitCode=0 Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.638504 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.640490 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerDied","Data":"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a"} Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.640532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481baa7a-7d97-44dd-b038-47a2969e3124","Type":"ContainerDied","Data":"a9dfb109de6163f92ba69046ccc22af7123ef0aaea67064b19b2d8de7e173ef9"} Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.640553 4658 scope.go:117] "RemoveContainer" containerID="c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.668948 4658 scope.go:117] "RemoveContainer" containerID="52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.685571 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.695091 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.709636 4658 scope.go:117] "RemoveContainer" containerID="335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.717115 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.717690 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="sg-core" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.717712 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="sg-core" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.717743 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="proxy-httpd" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.717751 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="proxy-httpd" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.717772 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-central-agent" Oct 02 11:39:24 
crc kubenswrapper[4658]: I1002 11:39:24.717781 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-central-agent" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.717800 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-notification-agent" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.717807 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-notification-agent" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.718011 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-notification-agent" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.718030 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="proxy-httpd" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.718060 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="sg-core" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.719697 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" containerName="ceilometer-central-agent" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.722032 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.726139 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.726414 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.726596 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.729960 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.734440 4658 scope.go:117] "RemoveContainer" containerID="3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.764817 4658 scope.go:117] "RemoveContainer" containerID="c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.765443 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f\": container with ID starting with c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f not found: ID does not exist" containerID="c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.765590 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f"} err="failed to get container status \"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f\": rpc error: code = NotFound desc = could not find container \"c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f\": container 
with ID starting with c83940af5fe6a04a6660965d646707c2a8fe8ae273d2dfa15729d82b6ac45c9f not found: ID does not exist" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.765700 4658 scope.go:117] "RemoveContainer" containerID="52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.766123 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9\": container with ID starting with 52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9 not found: ID does not exist" containerID="52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.766221 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9"} err="failed to get container status \"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9\": rpc error: code = NotFound desc = could not find container \"52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9\": container with ID starting with 52d07aa63e140f2d891b58f958e58db21db387fa7b7e836638129389f1529db9 not found: ID does not exist" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.766355 4658 scope.go:117] "RemoveContainer" containerID="335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.767171 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a\": container with ID starting with 335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a not found: ID does not exist" containerID="335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.767231 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a"} err="failed to get container status \"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a\": rpc error: code = NotFound desc = could not find container \"335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a\": container with ID starting with 335eccf8e0900a79cb6998f964b3d5df560c159566964629b115c38ce8406e9a not found: ID does not exist" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.767269 4658 scope.go:117] "RemoveContainer" containerID="3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af" Oct 02 11:39:24 crc kubenswrapper[4658]: E1002 11:39:24.768005 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af\": container with ID starting with 3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af not found: ID does not exist" containerID="3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.768034 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af"} err="failed to get container status 
\"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af\": rpc error: code = NotFound desc = could not find container \"3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af\": container with ID starting with 3fdd9c1d348a3511c685d69663ba38179db67a9c41efd75ef12a45e85135b2af not found: ID does not exist" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.860824 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.860911 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861149 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnz6l\" (UniqueName: \"kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861236 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861273 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861308 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861330 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.861513 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.964950 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965117 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965172 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnz6l\" (UniqueName: \"kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965206 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965244 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965272 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965325 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.965382 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.966600 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.966747 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.971153 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.971154 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.972653 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.972674 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.973572 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:24 crc kubenswrapper[4658]: I1002 11:39:24.989969 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnz6l\" (UniqueName: \"kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l\") pod \"ceilometer-0\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " pod="openstack/ceilometer-0" Oct 02 11:39:25 crc kubenswrapper[4658]: I1002 11:39:25.040761 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:25 crc kubenswrapper[4658]: I1002 11:39:25.488693 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:25 crc kubenswrapper[4658]: W1002 11:39:25.491938 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba97544e_6fe2_46c0_8225_29059e833283.slice/crio-f2f561efd2379b6486e514d480aec9d9ae37526f0062f13cda96af537d265700 WatchSource:0}: Error finding container f2f561efd2379b6486e514d480aec9d9ae37526f0062f13cda96af537d265700: Status 404 returned error can't find the container with id f2f561efd2379b6486e514d480aec9d9ae37526f0062f13cda96af537d265700 Oct 02 11:39:25 crc kubenswrapper[4658]: I1002 11:39:25.649835 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerStarted","Data":"f2f561efd2379b6486e514d480aec9d9ae37526f0062f13cda96af537d265700"} Oct 02 11:39:25 crc kubenswrapper[4658]: I1002 11:39:25.964270 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481baa7a-7d97-44dd-b038-47a2969e3124" path="/var/lib/kubelet/pods/481baa7a-7d97-44dd-b038-47a2969e3124/volumes" Oct 02 11:39:26 crc kubenswrapper[4658]: I1002 11:39:26.662904 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerStarted","Data":"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03"} Oct 02 11:39:27 crc kubenswrapper[4658]: I1002 11:39:27.680584 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerStarted","Data":"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064"} Oct 02 11:39:28 crc kubenswrapper[4658]: I1002 11:39:28.691598 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerStarted","Data":"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846"} Oct 02 11:39:28 crc kubenswrapper[4658]: I1002 11:39:28.918129 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:39:28 crc kubenswrapper[4658]: I1002 11:39:28.949671 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:39:28 crc kubenswrapper[4658]: I1002 11:39:28.980380 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 11:39:29 crc kubenswrapper[4658]: I1002 11:39:29.703005 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerStarted","Data":"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d"} Oct 02 11:39:29 crc kubenswrapper[4658]: I1002 11:39:29.703349 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:39:29 crc kubenswrapper[4658]: I1002 11:39:29.726190 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.222438449 podStartE2EDuration="5.72617431s" podCreationTimestamp="2025-10-02 11:39:24 +0000 UTC" firstStartedPulling="2025-10-02 11:39:25.494457358 +0000 UTC m=+1246.385610925" 
lastFinishedPulling="2025-10-02 11:39:28.998193219 +0000 UTC m=+1249.889346786" observedRunningTime="2025-10-02 11:39:29.724504196 +0000 UTC m=+1250.615657763" watchObservedRunningTime="2025-10-02 11:39:29.72617431 +0000 UTC m=+1250.617327867" Oct 02 11:39:29 crc kubenswrapper[4658]: I1002 11:39:29.741605 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:39:31 crc kubenswrapper[4658]: I1002 11:39:31.134335 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:39:31 crc kubenswrapper[4658]: I1002 11:39:31.134840 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:39:32 crc kubenswrapper[4658]: I1002 11:39:32.100081 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:39:32 crc kubenswrapper[4658]: I1002 11:39:32.216492 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:32 crc kubenswrapper[4658]: I1002 11:39:32.217269 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.747390 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.753458 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.785949 4658 generic.go:334] "Generic (PLEG): container finished" podID="4099dbad-5133-4954-9bf5-1131c1d0164a" containerID="862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6" exitCode=137 Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.786021 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4099dbad-5133-4954-9bf5-1131c1d0164a","Type":"ContainerDied","Data":"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6"} Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.786055 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4099dbad-5133-4954-9bf5-1131c1d0164a","Type":"ContainerDied","Data":"186b434da1fd392228694b359b4b121fef9e4f6f38ee395d5bac0e6169a98d3e"} Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.786077 4658 scope.go:117] "RemoveContainer" containerID="862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.786210 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.789131 4658 generic.go:334] "Generic (PLEG): container finished" podID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerID="4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1" exitCode=137 Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.789163 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerDied","Data":"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1"} Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.789184 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d3f3b46-3130-49ca-a5a0-67bcf53277d6","Type":"ContainerDied","Data":"4a98143d8d4019ec3eebd6324c3de13b88ac48b46d75b6fefaa09b095de9388c"} Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.789232 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.814518 4658 scope.go:117] "RemoveContainer" containerID="862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6" Oct 02 11:39:36 crc kubenswrapper[4658]: E1002 11:39:36.815921 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6\": container with ID starting with 862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6 not found: ID does not exist" containerID="862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.815964 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6"} err="failed to get container status \"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6\": rpc error: code = NotFound desc = could not find container \"862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6\": container with ID starting with 862e3025b3a2e9d3a0d5fa8ac948c7a38d79d29a7e72ae1dc119ed113f0cb1b6 not found: ID does not exist" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.815988 4658 scope.go:117] "RemoveContainer" containerID="4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.838719 4658 scope.go:117] "RemoveContainer" containerID="bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.860467 4658 scope.go:117] "RemoveContainer" containerID="4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1" Oct 02 11:39:36 crc kubenswrapper[4658]: E1002 11:39:36.860870 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1\": container with ID starting with 4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1 not found: ID does not exist" containerID="4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.860897 4658 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1"} err="failed to get container status \"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1\": rpc error: code = NotFound desc = could not find container \"4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1\": container with ID starting with 4b0ed34c432d0590dcf048e84207e0ba4c2e34b25d03e00407a38a45e5c7b8c1 not found: ID does not exist" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.860917 4658 scope.go:117] "RemoveContainer" containerID="bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394" Oct 02 11:39:36 crc kubenswrapper[4658]: E1002 11:39:36.861207 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394\": container with ID starting with bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394 not found: ID does not exist" containerID="bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.861253 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394"} err="failed to get container status \"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394\": rpc error: code = NotFound desc = could not find container \"bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394\": container with ID starting with bb01ee70d5a59acdf534cfc43a040845f654b824f70aab8077cbbd32f0a5c394 not found: ID does not exist" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905241 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle\") pod \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905425 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdz9\" (UniqueName: \"kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9\") pod \"4099dbad-5133-4954-9bf5-1131c1d0164a\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905511 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfgsg\" (UniqueName: \"kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg\") pod \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905539 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs\") pod \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905659 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data\") pod \"4099dbad-5133-4954-9bf5-1131c1d0164a\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905730 4658 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle\") pod \"4099dbad-5133-4954-9bf5-1131c1d0164a\" (UID: \"4099dbad-5133-4954-9bf5-1131c1d0164a\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.905749 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data\") pod \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\" (UID: \"7d3f3b46-3130-49ca-a5a0-67bcf53277d6\") " Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.906158 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs" (OuterVolumeSpecName: "logs") pod "7d3f3b46-3130-49ca-a5a0-67bcf53277d6" (UID: "7d3f3b46-3130-49ca-a5a0-67bcf53277d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.906874 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.914526 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9" (OuterVolumeSpecName: "kube-api-access-6tdz9") pod "4099dbad-5133-4954-9bf5-1131c1d0164a" (UID: "4099dbad-5133-4954-9bf5-1131c1d0164a"). InnerVolumeSpecName "kube-api-access-6tdz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.925038 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg" (OuterVolumeSpecName: "kube-api-access-pfgsg") pod "7d3f3b46-3130-49ca-a5a0-67bcf53277d6" (UID: "7d3f3b46-3130-49ca-a5a0-67bcf53277d6"). InnerVolumeSpecName "kube-api-access-pfgsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.934588 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4099dbad-5133-4954-9bf5-1131c1d0164a" (UID: "4099dbad-5133-4954-9bf5-1131c1d0164a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.935990 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data" (OuterVolumeSpecName: "config-data") pod "4099dbad-5133-4954-9bf5-1131c1d0164a" (UID: "4099dbad-5133-4954-9bf5-1131c1d0164a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.936010 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3f3b46-3130-49ca-a5a0-67bcf53277d6" (UID: "7d3f3b46-3130-49ca-a5a0-67bcf53277d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:36 crc kubenswrapper[4658]: I1002 11:39:36.944648 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data" (OuterVolumeSpecName: "config-data") pod "7d3f3b46-3130-49ca-a5a0-67bcf53277d6" (UID: "7d3f3b46-3130-49ca-a5a0-67bcf53277d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008045 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfgsg\" (UniqueName: \"kubernetes.io/projected/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-kube-api-access-pfgsg\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008081 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008093 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099dbad-5133-4954-9bf5-1131c1d0164a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008102 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008111 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3f3b46-3130-49ca-a5a0-67bcf53277d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.008119 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdz9\" (UniqueName: \"kubernetes.io/projected/4099dbad-5133-4954-9bf5-1131c1d0164a-kube-api-access-6tdz9\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.127428 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.140047 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.166959 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.193901 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.193974 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: E1002 11:39:37.194348 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-log" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.194361 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-log" Oct 02 11:39:37 crc kubenswrapper[4658]: E1002 11:39:37.194384 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-metadata" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 
11:39:37.194391 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-metadata" Oct 02 11:39:37 crc kubenswrapper[4658]: E1002 11:39:37.194427 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4099dbad-5133-4954-9bf5-1131c1d0164a" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.194433 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4099dbad-5133-4954-9bf5-1131c1d0164a" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.194646 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4099dbad-5133-4954-9bf5-1131c1d0164a" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.194664 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-log" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.194680 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" containerName="nova-metadata-metadata" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.195308 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.195383 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.205937 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.208598 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.213723 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.214547 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.214672 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.214777 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.214927 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.222377 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316755 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8pq\" (UniqueName: \"kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316796 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzkm\" (UniqueName: \"kubernetes.io/projected/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-kube-api-access-qdzkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316829 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316879 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316893 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316910 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316951 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.316992 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.317025 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.317052 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.418797 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.419250 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8pq\" (UniqueName: \"kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.419702 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzkm\" (UniqueName: \"kubernetes.io/projected/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-kube-api-access-qdzkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.419783 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420274 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420311 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420328 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420364 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420409 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420442 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.420932 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.422840 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.423213 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.423436 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.424762 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc 
kubenswrapper[4658]: I1002 11:39:37.424886 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.425144 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.425315 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.435372 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzkm\" (UniqueName: \"kubernetes.io/projected/7e69ac9b-be4b-4d88-bf64-06f4ca3966ba-kube-api-access-qdzkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.436154 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8pq\" (UniqueName: \"kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq\") pod \"nova-metadata-0\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.530174 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.544940 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.964571 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4099dbad-5133-4954-9bf5-1131c1d0164a" path="/var/lib/kubelet/pods/4099dbad-5133-4954-9bf5-1131c1d0164a/volumes" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.965806 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3f3b46-3130-49ca-a5a0-67bcf53277d6" path="/var/lib/kubelet/pods/7d3f3b46-3130-49ca-a5a0-67bcf53277d6/volumes" Oct 02 11:39:37 crc kubenswrapper[4658]: I1002 11:39:37.986764 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:39:37 crc kubenswrapper[4658]: W1002 11:39:37.989747 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e69ac9b_be4b_4d88_bf64_06f4ca3966ba.slice/crio-9f0067ffcfaeadc80e1076fbb5217435da90efd6cf85f9a34fbb08aceb34fe6c WatchSource:0}: Error finding container 9f0067ffcfaeadc80e1076fbb5217435da90efd6cf85f9a34fbb08aceb34fe6c: Status 404 returned error can't find the container with id 9f0067ffcfaeadc80e1076fbb5217435da90efd6cf85f9a34fbb08aceb34fe6c Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.080398 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.816035 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerStarted","Data":"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91"} Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.816343 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerStarted","Data":"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5"} Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.816358 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerStarted","Data":"405c988b32dd99a458f8f293eb30874b4bc6eee4b232c4d299f50d422d51f6a1"} Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.818539 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba","Type":"ContainerStarted","Data":"93accb391b8bd78b200913c831144621b8275cf84ab61c734cda6f1ca593fa09"} Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.818571 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e69ac9b-be4b-4d88-bf64-06f4ca3966ba","Type":"ContainerStarted","Data":"9f0067ffcfaeadc80e1076fbb5217435da90efd6cf85f9a34fbb08aceb34fe6c"} Oct 02 11:39:38 crc kubenswrapper[4658]: I1002 11:39:38.841211 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.841186552 podStartE2EDuration="1.841186552s" podCreationTimestamp="2025-10-02 11:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:38.835784097 +0000 UTC m=+1259.726937684" watchObservedRunningTime="2025-10-02 11:39:38.841186552 +0000 UTC m=+1259.732340119" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.137800 
4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.138572 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.141661 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.141880 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.160588 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.160571449 podStartE2EDuration="4.160571449s" podCreationTimestamp="2025-10-02 11:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:38.862441644 +0000 UTC m=+1259.753595221" watchObservedRunningTime="2025-10-02 11:39:41.160571449 +0000 UTC m=+1262.051725016" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.848286 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:39:41 crc kubenswrapper[4658]: I1002 11:39:41.853010 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.013620 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.016772 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.031927 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.131933 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.131992 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcfw\" (UniqueName: \"kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.132197 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.132267 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") 
" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.132290 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.132892 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234764 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234805 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcfw\" (UniqueName: \"kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234846 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234886 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234905 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.234981 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.235870 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " 
pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.236243 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.236470 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.236940 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.238023 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.261376 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcfw\" (UniqueName: \"kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw\") pod \"dnsmasq-dns-59cf4bdb65-fd56n\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.339911 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.531242 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.546405 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.547260 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.687761 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:39:42 crc kubenswrapper[4658]: W1002 11:39:42.695617 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3249d7_8466_4f22_ba34_a4d6533e1de4.slice/crio-4b8d2ccb543adb1f6116aa4b8a8afb33388dba72fd61aa94045c4911ea6aff42 WatchSource:0}: Error finding container 4b8d2ccb543adb1f6116aa4b8a8afb33388dba72fd61aa94045c4911ea6aff42: Status 404 returned error can't find the container with id 4b8d2ccb543adb1f6116aa4b8a8afb33388dba72fd61aa94045c4911ea6aff42 Oct 02 11:39:42 crc kubenswrapper[4658]: I1002 11:39:42.858349 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" event={"ID":"0a3249d7-8466-4f22-ba34-a4d6533e1de4","Type":"ContainerStarted","Data":"4b8d2ccb543adb1f6116aa4b8a8afb33388dba72fd61aa94045c4911ea6aff42"} Oct 02 11:39:43 crc kubenswrapper[4658]: I1002 11:39:43.870503 4658 generic.go:334] "Generic (PLEG): container finished" podID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerID="0b531ab7d9624e0654f1ab4dd15e708611fb16825891ff85c69fbf1c0579ba63" exitCode=0 Oct 02 11:39:43 crc kubenswrapper[4658]: I1002 11:39:43.870601 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" event={"ID":"0a3249d7-8466-4f22-ba34-a4d6533e1de4","Type":"ContainerDied","Data":"0b531ab7d9624e0654f1ab4dd15e708611fb16825891ff85c69fbf1c0579ba63"} Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.158420 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.159062 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-central-agent" containerID="cri-o://bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.159133 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="proxy-httpd" containerID="cri-o://78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.159183 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-notification-agent" containerID="cri-o://9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.159190 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="sg-core" containerID="cri-o://a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.176321 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.211:3000/\": EOF" Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.358552 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889398 4658 generic.go:334] "Generic (PLEG): container finished" podID="ba97544e-6fe2-46c0-8225-29059e833283" containerID="78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d" exitCode=0 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889670 4658 generic.go:334] "Generic (PLEG): container finished" podID="ba97544e-6fe2-46c0-8225-29059e833283" containerID="a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846" exitCode=2 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889680 4658 generic.go:334] "Generic (PLEG): container finished" podID="ba97544e-6fe2-46c0-8225-29059e833283" containerID="bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03" exitCode=0 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889718 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerDied","Data":"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d"} Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889742 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerDied","Data":"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846"} Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.889752 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerDied","Data":"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03"} Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.893813 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-log" containerID="cri-o://882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.895208 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" event={"ID":"0a3249d7-8466-4f22-ba34-a4d6533e1de4","Type":"ContainerStarted","Data":"723d3ae1b72510b7c1e6654dd8d2ab655ec4633e7062a03758012a249d06f7d6"} Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.895255 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-api" containerID="cri-o://5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d" gracePeriod=30 Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.895465 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:44 crc kubenswrapper[4658]: I1002 11:39:44.927882 4658 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" podStartSLOduration=3.927864216 podStartE2EDuration="3.927864216s" podCreationTimestamp="2025-10-02 11:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:44.921719406 +0000 UTC m=+1265.812872993" watchObservedRunningTime="2025-10-02 11:39:44.927864216 +0000 UTC m=+1265.819017783" Oct 02 11:39:45 crc kubenswrapper[4658]: I1002 11:39:45.905370 4658 generic.go:334] "Generic (PLEG): container finished" podID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerID="882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986" exitCode=143 Oct 02 11:39:45 crc kubenswrapper[4658]: I1002 11:39:45.905441 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerDied","Data":"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986"} Oct 02 11:39:47 crc kubenswrapper[4658]: I1002 11:39:47.530426 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:47 crc kubenswrapper[4658]: I1002 11:39:47.546160 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:39:47 crc kubenswrapper[4658]: I1002 11:39:47.546210 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:39:47 crc kubenswrapper[4658]: I1002 11:39:47.550646 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:47 crc kubenswrapper[4658]: I1002 11:39:47.961936 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.140564 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fvg7b"] Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.142078 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.145585 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.145790 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.160328 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fvg7b"] Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.252610 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.252733 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lb48\" (UniqueName: \"kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.252769 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.252857 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.356674 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lb48\" (UniqueName: \"kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.357186 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.357327 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.357975 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.370212 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.370406 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.378730 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.382547 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lb48\" (UniqueName: \"kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48\") pod \"nova-cell1-cell-mapping-fvg7b\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.575968 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.580639 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.580905 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.589844 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.658979 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.672933 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2cqv\" (UniqueName: \"kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv\") pod \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673002 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673099 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673147 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673173 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle\") pod \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673243 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673319 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673399 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnz6l\" (UniqueName: \"kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673428 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs\") pod \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673451 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: 
\"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673515 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") pod \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.673547 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts\") pod \"ba97544e-6fe2-46c0-8225-29059e833283\" (UID: \"ba97544e-6fe2-46c0-8225-29059e833283\") " Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.674357 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs" (OuterVolumeSpecName: "logs") pod "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" (UID: "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.675623 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.679696 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts" (OuterVolumeSpecName: "scripts") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.682123 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.696150 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l" (OuterVolumeSpecName: "kube-api-access-tnz6l") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "kube-api-access-tnz6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.720687 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv" (OuterVolumeSpecName: "kube-api-access-n2cqv") pod "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" (UID: "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f"). InnerVolumeSpecName "kube-api-access-n2cqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.737818 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.777495 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data" (OuterVolumeSpecName: "config-data") pod "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" (UID: "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.780738 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") pod \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\" (UID: \"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f\") " Oct 02 11:39:48 crc kubenswrapper[4658]: W1002 11:39:48.780907 4658 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f/volumes/kubernetes.io~secret/config-data Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.780926 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data" (OuterVolumeSpecName: "config-data") pod "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" (UID: "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.788078 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" (UID: "ffdf2ad0-ad63-4086-9563-5fa20aacaf4f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790712 4658 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790741 4658 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790750 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790760 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnz6l\" (UniqueName: \"kubernetes.io/projected/ba97544e-6fe2-46c0-8225-29059e833283-kube-api-access-tnz6l\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790771 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790779 4658 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba97544e-6fe2-46c0-8225-29059e833283-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790787 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790795 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.790803 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2cqv\" (UniqueName: \"kubernetes.io/projected/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f-kube-api-access-n2cqv\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.806635 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.838180 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.893610 4658 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.893872 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.915453 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data" (OuterVolumeSpecName: "config-data") pod "ba97544e-6fe2-46c0-8225-29059e833283" (UID: "ba97544e-6fe2-46c0-8225-29059e833283"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.963175 4658 generic.go:334] "Generic (PLEG): container finished" podID="ba97544e-6fe2-46c0-8225-29059e833283" containerID="9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064" exitCode=0 Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.963246 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerDied","Data":"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064"} Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.963274 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba97544e-6fe2-46c0-8225-29059e833283","Type":"ContainerDied","Data":"f2f561efd2379b6486e514d480aec9d9ae37526f0062f13cda96af537d265700"} Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.963304 4658 scope.go:117] "RemoveContainer" containerID="78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.963439 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.974247 4658 generic.go:334] "Generic (PLEG): container finished" podID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerID="5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d" exitCode=0 Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.974355 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.974379 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerDied","Data":"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d"} Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.974419 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffdf2ad0-ad63-4086-9563-5fa20aacaf4f","Type":"ContainerDied","Data":"15cd65593cc8f5c4659c5ff07769c93fd67985898f7df364b30f36fa7c69a7fa"} Oct 02 11:39:48 crc kubenswrapper[4658]: I1002 11:39:48.995476 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba97544e-6fe2-46c0-8225-29059e833283-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.017520 4658 scope.go:117] "RemoveContainer" containerID="a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.022883 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.045029 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056061 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.056538 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-log" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056556 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-log" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.056575 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-api" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056581 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-api" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.056593 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="proxy-httpd" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056599 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="proxy-httpd" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.056618 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-notification-agent" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056624 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-notification-agent" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.056642 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-central-agent" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056647 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-central-agent" Oct 02 11:39:49 crc 
kubenswrapper[4658]: E1002 11:39:49.056661 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="sg-core" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056666 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="sg-core" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056841 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-log" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056859 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-central-agent" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056872 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="sg-core" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056882 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" containerName="nova-api-api" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056892 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="ceilometer-notification-agent" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.056905 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97544e-6fe2-46c0-8225-29059e833283" containerName="proxy-httpd" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.061789 4658 scope.go:117] "RemoveContainer" containerID="9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.066008 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.079441 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.081785 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.081957 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.082034 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.090232 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.095141 4658 scope.go:117] "RemoveContainer" containerID="bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097061 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmgt\" (UniqueName: \"kubernetes.io/projected/4ef5828e-3cb4-4a6d-ba04-f474234450d3-kube-api-access-tdmgt\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097172 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-config-data\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097194 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097491 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097565 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097601 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097683 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-scripts\") 
pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.097822 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-run-httpd\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.100752 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.120474 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.123096 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.128872 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.129197 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.129412 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.144748 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.173156 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fvg7b"] Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199154 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199239 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-run-httpd\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199282 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199423 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199466 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmgt\" (UniqueName: \"kubernetes.io/projected/4ef5828e-3cb4-4a6d-ba04-f474234450d3-kube-api-access-tdmgt\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 
11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199544 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-config-data\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199577 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199611 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mpf6\" (UniqueName: \"kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199649 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199676 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199722 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199750 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199779 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-scripts\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.199819 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.200215 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.204730 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef5828e-3cb4-4a6d-ba04-f474234450d3-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.212832 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.213774 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.214258 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-scripts\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.222087 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-config-data\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.223543 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef5828e-3cb4-4a6d-ba04-f474234450d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.231092 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmgt\" (UniqueName: \"kubernetes.io/projected/4ef5828e-3cb4-4a6d-ba04-f474234450d3-kube-api-access-tdmgt\") pod \"ceilometer-0\" (UID: \"4ef5828e-3cb4-4a6d-ba04-f474234450d3\") " pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301525 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301630 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301662 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " 
pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301704 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301731 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.301811 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mpf6\" (UniqueName: \"kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.303515 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.307627 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.307771 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.310433 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.320594 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.321473 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mpf6\" (UniqueName: \"kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6\") pod \"nova-api-0\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.410443 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.418038 4658 scope.go:117] "RemoveContainer" containerID="78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.418602 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d\": container with ID starting with 78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d not found: ID does not exist" containerID="78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.418724 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d"} err="failed to get container status \"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d\": rpc error: code = NotFound desc = could not find container \"78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d\": container with ID starting with 78d2d79a4b996d09b0a62495065573b1b975468201975d4967b3e1a1aebfc46d not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.418818 4658 scope.go:117] "RemoveContainer" containerID="a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.419226 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846\": container with ID starting with a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846 not found: ID does not exist" containerID="a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.419342 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846"} err="failed to get container status \"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846\": rpc error: code = NotFound desc = could not find container \"a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846\": container with ID starting with a73b859b80ddced82ff270de4825bf857fa5d69826879f91dc89bb64391dc846 not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.419414 4658 scope.go:117] "RemoveContainer" containerID="9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.420113 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064\": container with ID starting with 9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064 not found: ID does not exist" containerID="9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.420146 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064"} err="failed to get container status \"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064\": rpc error: code = NotFound desc = could not 
find container \"9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064\": container with ID starting with 9839b3ed73c39e37962e90609779366dff7b2287fed81371c57cd8f84f756064 not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.420166 4658 scope.go:117] "RemoveContainer" containerID="bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.420819 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03\": container with ID starting with bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03 not found: ID does not exist" containerID="bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.420855 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03"} err="failed to get container status \"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03\": rpc error: code = NotFound desc = could not find container \"bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03\": container with ID starting with bbc26ecc12c6dde53c4958d1bacf3f4716342b309337bb5b147e842dfca67c03 not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.420873 4658 scope.go:117] "RemoveContainer" containerID="5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.492612 4658 scope.go:117] "RemoveContainer" containerID="882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.498930 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.533413 4658 scope.go:117] "RemoveContainer" containerID="5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.535452 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d\": container with ID starting with 5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d not found: ID does not exist" containerID="5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.535489 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d"} err="failed to get container status \"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d\": rpc error: code = NotFound desc = could not find container \"5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d\": container with ID starting with 5452b46fa50375d8ec8d4601f28cd032c5bdebbc4cc7a5af59a9de3bbc3c964d not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.535519 4658 scope.go:117] "RemoveContainer" containerID="882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986" Oct 02 11:39:49 crc kubenswrapper[4658]: E1002 11:39:49.538220 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986\": container with ID starting with 882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986 not found: ID does not exist" containerID="882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.538269 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986"} err="failed to get container status \"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986\": rpc error: code = NotFound desc = could not find container \"882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986\": container with ID starting with 882232650e244668b0bd83d99b873d08522298ca8905355b8f8bf78a47d8e986 not found: ID does not exist" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.945285 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:39:49 crc kubenswrapper[4658]: W1002 11:39:49.948515 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef5828e_3cb4_4a6d_ba04_f474234450d3.slice/crio-27f55ea60c4bb727170f0d51944efad8a7d0b209ac3eab597a6eb71a6be59b73 WatchSource:0}: Error finding container 27f55ea60c4bb727170f0d51944efad8a7d0b209ac3eab597a6eb71a6be59b73: Status 404 returned error can't find the container with id 27f55ea60c4bb727170f0d51944efad8a7d0b209ac3eab597a6eb71a6be59b73 Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.967675 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba97544e-6fe2-46c0-8225-29059e833283" path="/var/lib/kubelet/pods/ba97544e-6fe2-46c0-8225-29059e833283/volumes" Oct 02 11:39:49 crc kubenswrapper[4658]: I1002 11:39:49.968710 4658 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdf2ad0-ad63-4086-9563-5fa20aacaf4f" path="/var/lib/kubelet/pods/ffdf2ad0-ad63-4086-9563-5fa20aacaf4f/volumes" Oct 02 11:39:50 crc kubenswrapper[4658]: I1002 11:39:50.011581 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fvg7b" event={"ID":"8bf757ce-6767-4bed-98a4-394baf2cc6f8","Type":"ContainerStarted","Data":"4f1378482fc5e3aa5b781216f0c7930df6346459804584d54143aa11ee81ebe8"} Oct 02 11:39:50 crc kubenswrapper[4658]: I1002 11:39:50.011750 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fvg7b" event={"ID":"8bf757ce-6767-4bed-98a4-394baf2cc6f8","Type":"ContainerStarted","Data":"ef7c7d977fa3c43a18883009db2e95bf80e19a6478000f4c95b3c7bd8677f31e"} Oct 02 11:39:50 crc kubenswrapper[4658]: I1002 11:39:50.017054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef5828e-3cb4-4a6d-ba04-f474234450d3","Type":"ContainerStarted","Data":"27f55ea60c4bb727170f0d51944efad8a7d0b209ac3eab597a6eb71a6be59b73"} Oct 02 11:39:50 crc kubenswrapper[4658]: I1002 11:39:50.029718 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fvg7b" podStartSLOduration=2.029695439 podStartE2EDuration="2.029695439s" podCreationTimestamp="2025-10-02 11:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:50.027714674 +0000 UTC m=+1270.918868241" watchObservedRunningTime="2025-10-02 11:39:50.029695439 +0000 UTC m=+1270.920849006" Oct 02 11:39:50 crc kubenswrapper[4658]: I1002 11:39:50.062161 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:50 crc kubenswrapper[4658]: W1002 11:39:50.066737 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b42e9b_71a4_4136_8d7b_2922065b2fde.slice/crio-80c8a8ce1bb1e4bc3da3ae2355f5af019657a42bcf7174c845f624455e49cd49 WatchSource:0}: Error finding container 80c8a8ce1bb1e4bc3da3ae2355f5af019657a42bcf7174c845f624455e49cd49: Status 404 returned error can't find the container with id 80c8a8ce1bb1e4bc3da3ae2355f5af019657a42bcf7174c845f624455e49cd49 Oct 02 11:39:51 crc kubenswrapper[4658]: I1002 11:39:51.029553 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef5828e-3cb4-4a6d-ba04-f474234450d3","Type":"ContainerStarted","Data":"ab65923002f0f0cdbd62cfef1718d8087d4784719327b21ed86dfe67c3a966f6"} Oct 02 11:39:51 crc kubenswrapper[4658]: I1002 11:39:51.032357 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerStarted","Data":"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0"} Oct 02 11:39:51 crc kubenswrapper[4658]: I1002 11:39:51.032478 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerStarted","Data":"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e"} Oct 02 11:39:51 crc kubenswrapper[4658]: I1002 11:39:51.032950 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerStarted","Data":"80c8a8ce1bb1e4bc3da3ae2355f5af019657a42bcf7174c845f624455e49cd49"} Oct 02 11:39:51 crc kubenswrapper[4658]: I1002 11:39:51.052838 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.052820863 podStartE2EDuration="2.052820863s" podCreationTimestamp="2025-10-02 11:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:39:51.05240866 +0000 UTC m=+1271.943562227" watchObservedRunningTime="2025-10-02 11:39:51.052820863 +0000 UTC m=+1271.943974430" Oct 02 11:39:52 crc kubenswrapper[4658]: I1002 11:39:52.048212 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef5828e-3cb4-4a6d-ba04-f474234450d3","Type":"ContainerStarted","Data":"d6ea057673ad0f7c7ff7f55b968fbe5ea223a799a04253acdc92d3ec7bf036d4"} Oct 02 11:39:52 crc kubenswrapper[4658]: I1002 11:39:52.048938 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef5828e-3cb4-4a6d-ba04-f474234450d3","Type":"ContainerStarted","Data":"d0c1eba46b750f54c89a4e5a8164ac7c44c42a7f7170a8ab56bb45211b51b329"} Oct 02 11:39:52 crc kubenswrapper[4658]: I1002 11:39:52.342546 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:39:52 crc kubenswrapper[4658]: I1002 11:39:52.421762 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:52 crc kubenswrapper[4658]: I1002 11:39:52.421988 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="dnsmasq-dns" containerID="cri-o://7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f" gracePeriod=10 Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.008388 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.062504 4658 generic.go:334] "Generic (PLEG): container finished" podID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerID="7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f" exitCode=0 Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.062549 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerDied","Data":"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f"} Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.062589 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.062600 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6xf49" event={"ID":"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2","Type":"ContainerDied","Data":"97cf79ef180cbc62fc353da2b7aa25b7fee3b1f1a867117b5a6b729e608e5ae9"} Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.062618 4658 scope.go:117] "RemoveContainer" containerID="7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.086694 4658 scope.go:117] "RemoveContainer" containerID="6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.111858 4658 scope.go:117] "RemoveContainer" containerID="7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f" Oct 02 11:39:53 crc kubenswrapper[4658]: E1002 11:39:53.112985 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f\": container with ID starting with 7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f not found: ID does not exist" containerID="7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113026 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f"} err="failed to get container status \"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f\": rpc error: code = NotFound desc = could not find container \"7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f\": container with ID starting with 7fe71e2ddc7c6002a1b0b3374c9fa57ad93558bc467f8b538bca39ca24af8c5f not found: ID does not exist" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113054 4658 scope.go:117] "RemoveContainer" containerID="6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113166 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113356 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: E1002 11:39:53.113424 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0\": container with ID starting with 6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0 not found: ID does not exist" containerID="6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113453 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113464 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0"} err="failed to get container status \"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0\": rpc error: code = NotFound desc = could not find container \"6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0\": container with ID starting with 6c0f2a367f0f67c3342b6840d1ddd841f4bb819fe2b4977a6b9aba79aed968b0 not found: ID does not exist" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113510 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ddd5\" (UniqueName: \"kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113545 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.113575 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0\") pod \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\" (UID: \"51ba6acd-c67c-4f97-aea8-0121cb4bd4a2\") " Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.120485 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5" (OuterVolumeSpecName: "kube-api-access-7ddd5") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). InnerVolumeSpecName "kube-api-access-7ddd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.179919 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.188125 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.191157 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.191699 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.193819 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config" (OuterVolumeSpecName: "config") pod "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" (UID: "51ba6acd-c67c-4f97-aea8-0121cb4bd4a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216272 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216503 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ddd5\" (UniqueName: \"kubernetes.io/projected/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-kube-api-access-7ddd5\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216631 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216693 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216749 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.216837 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.416739 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.428656 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6xf49"] Oct 02 11:39:53 crc kubenswrapper[4658]: I1002 11:39:53.965825 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" path="/var/lib/kubelet/pods/51ba6acd-c67c-4f97-aea8-0121cb4bd4a2/volumes" Oct 02 11:39:55 crc kubenswrapper[4658]: I1002 11:39:55.094216 4658 generic.go:334] "Generic (PLEG): container finished" podID="8bf757ce-6767-4bed-98a4-394baf2cc6f8" containerID="4f1378482fc5e3aa5b781216f0c7930df6346459804584d54143aa11ee81ebe8" exitCode=0 Oct 02 11:39:55 crc kubenswrapper[4658]: I1002 11:39:55.094323 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-fvg7b" event={"ID":"8bf757ce-6767-4bed-98a4-394baf2cc6f8","Type":"ContainerDied","Data":"4f1378482fc5e3aa5b781216f0c7930df6346459804584d54143aa11ee81ebe8"} Oct 02 11:39:55 crc kubenswrapper[4658]: I1002 11:39:55.097753 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef5828e-3cb4-4a6d-ba04-f474234450d3","Type":"ContainerStarted","Data":"cc4f0d8a60c599a54522e003d5bb69194fe92be11c24e46b60bf689a2ec44115"} Oct 02 11:39:55 crc kubenswrapper[4658]: I1002 11:39:55.097925 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:39:55 crc kubenswrapper[4658]: I1002 11:39:55.139213 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.103692307 podStartE2EDuration="6.139194592s" podCreationTimestamp="2025-10-02 11:39:49 +0000 UTC" firstStartedPulling="2025-10-02 11:39:49.957388559 +0000 UTC m=+1270.848542126" lastFinishedPulling="2025-10-02 11:39:53.992890844 +0000 UTC m=+1274.884044411" observedRunningTime="2025-10-02 11:39:55.131808941 +0000 UTC m=+1276.022962528" watchObservedRunningTime="2025-10-02 11:39:55.139194592 +0000 UTC m=+1276.030348159" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.491595 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.597884 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts\") pod \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.598009 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lb48\" (UniqueName: \"kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48\") pod \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.598055 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle\") pod \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.598231 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data\") pod \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\" (UID: \"8bf757ce-6767-4bed-98a4-394baf2cc6f8\") " Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.610044 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts" (OuterVolumeSpecName: "scripts") pod "8bf757ce-6767-4bed-98a4-394baf2cc6f8" (UID: "8bf757ce-6767-4bed-98a4-394baf2cc6f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.611441 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48" (OuterVolumeSpecName: "kube-api-access-9lb48") pod "8bf757ce-6767-4bed-98a4-394baf2cc6f8" (UID: "8bf757ce-6767-4bed-98a4-394baf2cc6f8"). InnerVolumeSpecName "kube-api-access-9lb48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.634935 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bf757ce-6767-4bed-98a4-394baf2cc6f8" (UID: "8bf757ce-6767-4bed-98a4-394baf2cc6f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.637409 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data" (OuterVolumeSpecName: "config-data") pod "8bf757ce-6767-4bed-98a4-394baf2cc6f8" (UID: "8bf757ce-6767-4bed-98a4-394baf2cc6f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.701082 4658 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.701319 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lb48\" (UniqueName: \"kubernetes.io/projected/8bf757ce-6767-4bed-98a4-394baf2cc6f8-kube-api-access-9lb48\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.701331 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:56 crc kubenswrapper[4658]: I1002 11:39:56.701340 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf757ce-6767-4bed-98a4-394baf2cc6f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.142625 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fvg7b" event={"ID":"8bf757ce-6767-4bed-98a4-394baf2cc6f8","Type":"ContainerDied","Data":"ef7c7d977fa3c43a18883009db2e95bf80e19a6478000f4c95b3c7bd8677f31e"} Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.142680 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef7c7d977fa3c43a18883009db2e95bf80e19a6478000f4c95b3c7bd8677f31e" Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.142714 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fvg7b" Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.299122 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.299509 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerName="nova-scheduler-scheduler" containerID="cri-o://82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" gracePeriod=30 Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.338655 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.338728 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.339048 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-log" containerID="cri-o://3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5" gracePeriod=30 Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.339680 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-log" containerID="cri-o://9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" gracePeriod=30 Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.339816 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-api" containerID="cri-o://25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" gracePeriod=30 Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.339183 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-metadata" containerID="cri-o://a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91" gracePeriod=30 Oct 02 11:39:57 crc kubenswrapper[4658]: I1002 11:39:57.940776 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052245 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052670 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052777 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052800 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052814 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs" (OuterVolumeSpecName: "logs") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.052959 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mpf6\" (UniqueName: \"kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.053070 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs\") pod \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\" (UID: \"e7b42e9b-71a4-4136-8d7b-2922065b2fde\") " Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.053630 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b42e9b-71a4-4136-8d7b-2922065b2fde-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.058693 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6" (OuterVolumeSpecName: "kube-api-access-9mpf6") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "kube-api-access-9mpf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.079970 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data" (OuterVolumeSpecName: "config-data") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.080409 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.111501 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.116878 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e7b42e9b-71a4-4136-8d7b-2922065b2fde" (UID: "e7b42e9b-71a4-4136-8d7b-2922065b2fde"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.154776 4658 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.155157 4658 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.155259 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.155340 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b42e9b-71a4-4136-8d7b-2922065b2fde-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.155416 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mpf6\" (UniqueName: \"kubernetes.io/projected/e7b42e9b-71a4-4136-8d7b-2922065b2fde-kube-api-access-9mpf6\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.156815 4658 generic.go:334] "Generic (PLEG): container finished" podID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerID="3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5" exitCode=143 Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.156945 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerDied","Data":"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5"} Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160035 4658 generic.go:334] "Generic (PLEG): container finished" podID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerID="25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" exitCode=0 Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160070 4658 generic.go:334] "Generic (PLEG): container finished" podID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerID="9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" exitCode=143 Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160091 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerDied","Data":"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0"} Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160118 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerDied","Data":"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e"} Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160130 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7b42e9b-71a4-4136-8d7b-2922065b2fde","Type":"ContainerDied","Data":"80c8a8ce1bb1e4bc3da3ae2355f5af019657a42bcf7174c845f624455e49cd49"} Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160146 4658 scope.go:117] "RemoveContainer" containerID="25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.160276 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.185909 4658 scope.go:117] "RemoveContainer" containerID="9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.223155 4658 scope.go:117] "RemoveContainer" containerID="25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.223921 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0\": container with ID starting with 25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0 not found: ID does not exist" containerID="25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.223974 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0"} err="failed to get container status \"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0\": rpc error: code = NotFound desc = could not find container \"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0\": container with ID starting with 25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0 not found: ID does not exist" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.224038 4658 scope.go:117] "RemoveContainer" containerID="9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.224501 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e\": container with ID starting with 9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e not found: ID does not exist" containerID="9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.224524 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e"} err="failed to get container status \"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e\": rpc error: code = NotFound desc = could not find container \"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e\": container with ID starting with 9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e not found: ID does not exist" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.224539 4658 scope.go:117] "RemoveContainer" containerID="25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.224999 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0"} err="failed to get container status \"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0\": rpc error: code = NotFound desc = could not find container \"25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0\": container with ID starting with 25f5d68fafc9fcd102df36b733e285b47659dc3219225e8564cbf79e433f32e0 not found: ID does not exist" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.225019 4658 
scope.go:117] "RemoveContainer" containerID="9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.225812 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e"} err="failed to get container status \"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e\": rpc error: code = NotFound desc = could not find container \"9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e\": container with ID starting with 9965e2e2435867d663c82c1ea193f689b3d0c2616110a8c13f7188aae7d3ac0e not found: ID does not exist" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.245498 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.282358 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300095 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.300560 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf757ce-6767-4bed-98a4-394baf2cc6f8" containerName="nova-manage" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300581 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf757ce-6767-4bed-98a4-394baf2cc6f8" containerName="nova-manage" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.300596 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-api" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300604 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-api" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.300622 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-log" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300631 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-log" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.300656 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="dnsmasq-dns" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300664 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="dnsmasq-dns" Oct 02 11:39:58 crc kubenswrapper[4658]: E1002 11:39:58.300689 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="init" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300696 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="init" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300929 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-log" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300950 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf757ce-6767-4bed-98a4-394baf2cc6f8" containerName="nova-manage" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300967 4658 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" containerName="nova-api-api" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.300978 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ba6acd-c67c-4f97-aea8-0121cb4bd4a2" containerName="dnsmasq-dns" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.302281 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.307210 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.318809 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.319030 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.319217 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461199 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx6fr\" (UniqueName: \"kubernetes.io/projected/2b26ff3c-8765-4911-aee2-54a863e4fd7c-kube-api-access-hx6fr\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461410 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461631 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26ff3c-8765-4911-aee2-54a863e4fd7c-logs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461768 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461802 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.461999 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-config-data\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.564374 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.564827 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-config-data\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.564871 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx6fr\" (UniqueName: \"kubernetes.io/projected/2b26ff3c-8765-4911-aee2-54a863e4fd7c-kube-api-access-hx6fr\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.564916 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.564980 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26ff3c-8765-4911-aee2-54a863e4fd7c-logs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.565036 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.565461 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26ff3c-8765-4911-aee2-54a863e4fd7c-logs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.568530 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-config-data\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.568722 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.572382 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.573274 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26ff3c-8765-4911-aee2-54a863e4fd7c-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.583771 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx6fr\" (UniqueName: \"kubernetes.io/projected/2b26ff3c-8765-4911-aee2-54a863e4fd7c-kube-api-access-hx6fr\") pod \"nova-api-0\" (UID: \"2b26ff3c-8765-4911-aee2-54a863e4fd7c\") " pod="openstack/nova-api-0" Oct 02 11:39:58 crc kubenswrapper[4658]: I1002 11:39:58.643196 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:39:59 crc kubenswrapper[4658]: E1002 11:39:58.920734 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe is running failed: container process not found" containerID="82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:59 crc kubenswrapper[4658]: E1002 11:39:58.921595 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe is running failed: container process not found" containerID="82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:59 crc kubenswrapper[4658]: E1002 11:39:58.922473 4658 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe is running failed: container process not found" containerID="82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:39:59 crc kubenswrapper[4658]: E1002 11:39:58.922513 4658 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerName="nova-scheduler-scheduler" Oct 02 11:39:59 crc kubenswrapper[4658]: I1002 11:39:59.171174 4658 generic.go:334] "Generic (PLEG): container finished" podID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerID="82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" exitCode=0 Oct 02 11:39:59 crc kubenswrapper[4658]: I1002 11:39:59.171245 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9dede2bc-0f08-4ce1-8977-c5427f0ad52f","Type":"ContainerDied","Data":"82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe"} Oct 02 11:39:59 crc kubenswrapper[4658]: I1002 11:39:59.960976 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b42e9b-71a4-4136-8d7b-2922065b2fde" path="/var/lib/kubelet/pods/e7b42e9b-71a4-4136-8d7b-2922065b2fde/volumes" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.062980 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.152381 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.197097 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9dede2bc-0f08-4ce1-8977-c5427f0ad52f","Type":"ContainerDied","Data":"be3fca7353312b5d874d06cdc15b13538ca04ef07250aa9a0a823de4c33c8af2"} Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.197168 4658 scope.go:117] "RemoveContainer" containerID="82310c66edf688149b41875f4da73fd8455f3588771aeca3a1cd0236b05c65fe" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.197354 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.199451 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b26ff3c-8765-4911-aee2-54a863e4fd7c","Type":"ContainerStarted","Data":"5c5c1f1e776a963194093a82782c82df926a0c45680cc4d5763ab1a4e82e6ccf"} Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.300279 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data\") pod \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.300552 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle\") pod \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.300632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm\") pod \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\" (UID: \"9dede2bc-0f08-4ce1-8977-c5427f0ad52f\") " Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.303239 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm" (OuterVolumeSpecName: "kube-api-access-5rmkm") pod "9dede2bc-0f08-4ce1-8977-c5427f0ad52f" (UID: "9dede2bc-0f08-4ce1-8977-c5427f0ad52f"). InnerVolumeSpecName "kube-api-access-5rmkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.326739 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dede2bc-0f08-4ce1-8977-c5427f0ad52f" (UID: "9dede2bc-0f08-4ce1-8977-c5427f0ad52f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.329101 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data" (OuterVolumeSpecName: "config-data") pod "9dede2bc-0f08-4ce1-8977-c5427f0ad52f" (UID: "9dede2bc-0f08-4ce1-8977-c5427f0ad52f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.402610 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.402648 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.402657 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/9dede2bc-0f08-4ce1-8977-c5427f0ad52f-kube-api-access-5rmkm\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.629106 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.645128 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.660343 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:40:00 crc kubenswrapper[4658]: E1002 11:40:00.660791 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerName="nova-scheduler-scheduler" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.660803 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerName="nova-scheduler-scheduler" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.660980 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" containerName="nova-scheduler-scheduler" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.661802 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.675186 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.698460 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.813388 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-config-data\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.813553 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4t6\" (UniqueName: \"kubernetes.io/projected/c066a72f-72df-47f5-b481-12ba73cb8d5f-kube-api-access-br4t6\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.813581 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.915728 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br4t6\" (UniqueName: \"kubernetes.io/projected/c066a72f-72df-47f5-b481-12ba73cb8d5f-kube-api-access-br4t6\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.915793 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.917120 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-config-data\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.921723 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.922651 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c066a72f-72df-47f5-b481-12ba73cb8d5f-config-data\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.936878 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4t6\" (UniqueName: 
\"kubernetes.io/projected/c066a72f-72df-47f5-b481-12ba73cb8d5f-kube-api-access-br4t6\") pod \"nova-scheduler-0\" (UID: \"c066a72f-72df-47f5-b481-12ba73cb8d5f\") " pod="openstack/nova-scheduler-0" Oct 02 11:40:00 crc kubenswrapper[4658]: I1002 11:40:00.986256 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.018274 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.120243 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs\") pod \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.120363 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz8pq\" (UniqueName: \"kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq\") pod \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.120433 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs\") pod \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.120519 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle\") pod \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.120604 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data\") pod \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\" (UID: \"7a2443d8-6a60-4b09-82d5-c3fe639cb819\") " Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.122479 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs" (OuterVolumeSpecName: "logs") pod "7a2443d8-6a60-4b09-82d5-c3fe639cb819" (UID: "7a2443d8-6a60-4b09-82d5-c3fe639cb819"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.127045 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq" (OuterVolumeSpecName: "kube-api-access-zz8pq") pod "7a2443d8-6a60-4b09-82d5-c3fe639cb819" (UID: "7a2443d8-6a60-4b09-82d5-c3fe639cb819"). InnerVolumeSpecName "kube-api-access-zz8pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.150201 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data" (OuterVolumeSpecName: "config-data") pod "7a2443d8-6a60-4b09-82d5-c3fe639cb819" (UID: "7a2443d8-6a60-4b09-82d5-c3fe639cb819"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.164351 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a2443d8-6a60-4b09-82d5-c3fe639cb819" (UID: "7a2443d8-6a60-4b09-82d5-c3fe639cb819"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.175880 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7a2443d8-6a60-4b09-82d5-c3fe639cb819" (UID: "7a2443d8-6a60-4b09-82d5-c3fe639cb819"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.212093 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b26ff3c-8765-4911-aee2-54a863e4fd7c","Type":"ContainerStarted","Data":"ad60ccbd3ade4b1de3b96dc3d4d1aabf1b3ebf7949b0a749854cd6958e63407c"} Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.212136 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b26ff3c-8765-4911-aee2-54a863e4fd7c","Type":"ContainerStarted","Data":"ddbca713d4be9f8e0c20919d7b81141ce1d6f617c9577dd55ab65a3173222f86"} Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.214656 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.214689 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerDied","Data":"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91"} Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.214739 4658 scope.go:117] "RemoveContainer" containerID="a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.214571 4658 generic.go:334] "Generic (PLEG): container finished" podID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerID="a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91" exitCode=0 Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.214828 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a2443d8-6a60-4b09-82d5-c3fe639cb819","Type":"ContainerDied","Data":"405c988b32dd99a458f8f293eb30874b4bc6eee4b232c4d299f50d422d51f6a1"} Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.222452 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.222478 4658 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.222488 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz8pq\" (UniqueName: 
\"kubernetes.io/projected/7a2443d8-6a60-4b09-82d5-c3fe639cb819-kube-api-access-zz8pq\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.222496 4658 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2443d8-6a60-4b09-82d5-c3fe639cb819-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.222507 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2443d8-6a60-4b09-82d5-c3fe639cb819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.243242 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.243219329 podStartE2EDuration="3.243219329s" podCreationTimestamp="2025-10-02 11:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:40:01.230769925 +0000 UTC m=+1282.121923492" watchObservedRunningTime="2025-10-02 11:40:01.243219329 +0000 UTC m=+1282.134372886" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.244787 4658 scope.go:117] "RemoveContainer" containerID="3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.258433 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.280874 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:40:01 crc kubenswrapper[4658]: E1002 11:40:01.282796 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a2443d8_6a60_4b09_82d5_c3fe639cb819.slice/crio-405c988b32dd99a458f8f293eb30874b4bc6eee4b232c4d299f50d422d51f6a1\": RecentStats: unable to find data in memory cache]" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.286110 4658 scope.go:117] "RemoveContainer" containerID="a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91" Oct 02 11:40:01 crc kubenswrapper[4658]: E1002 11:40:01.286489 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91\": container with ID starting with a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91 not found: ID does not exist" containerID="a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.286527 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91"} err="failed to get container status \"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91\": rpc error: code = NotFound desc = could not find container \"a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91\": container with ID starting with a9e272acc052c28dddd372161285e88f2c147a5d0f3e76a5a353846909b2fe91 not found: ID does not exist" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.286548 4658 scope.go:117] "RemoveContainer" containerID="3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5" Oct 02 11:40:01 crc kubenswrapper[4658]: E1002 
11:40:01.287235 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5\": container with ID starting with 3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5 not found: ID does not exist" containerID="3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.287385 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5"} err="failed to get container status \"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5\": rpc error: code = NotFound desc = could not find container \"3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5\": container with ID starting with 3f54257bab2b6e2dd7831f145928d5ae9d06a0c492e254474bfe126555820ef5 not found: ID does not exist" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.296188 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:40:01 crc kubenswrapper[4658]: E1002 11:40:01.296614 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-log" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.296632 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-log" Oct 02 11:40:01 crc kubenswrapper[4658]: E1002 11:40:01.296654 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-metadata" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.296661 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-metadata" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.296862 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-log" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.296890 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" containerName="nova-metadata-metadata" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.298052 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.301018 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.301167 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.307947 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.409020 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:40:01 crc kubenswrapper[4658]: W1002 11:40:01.409861 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc066a72f_72df_47f5_b481_12ba73cb8d5f.slice/crio-dcf73123163ca781c03525975fe38952ae84aeb3ba5caf9ecef998a8e5056b7f WatchSource:0}: Error finding container dcf73123163ca781c03525975fe38952ae84aeb3ba5caf9ecef998a8e5056b7f: Status 404 returned error can't find the container with id dcf73123163ca781c03525975fe38952ae84aeb3ba5caf9ecef998a8e5056b7f Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.425801 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.425965 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-config-data\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.426004 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.426037 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dsjt\" (UniqueName: \"kubernetes.io/projected/f818de7d-6833-4011-aded-a3de906237c4-kube-api-access-4dsjt\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.426062 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f818de7d-6833-4011-aded-a3de906237c4-logs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.528206 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 
11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.528279 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dsjt\" (UniqueName: \"kubernetes.io/projected/f818de7d-6833-4011-aded-a3de906237c4-kube-api-access-4dsjt\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.528328 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f818de7d-6833-4011-aded-a3de906237c4-logs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.528730 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.529427 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-config-data\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.529786 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f818de7d-6833-4011-aded-a3de906237c4-logs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.533210 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.533612 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.534939 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f818de7d-6833-4011-aded-a3de906237c4-config-data\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.545173 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dsjt\" (UniqueName: \"kubernetes.io/projected/f818de7d-6833-4011-aded-a3de906237c4-kube-api-access-4dsjt\") pod \"nova-metadata-0\" (UID: \"f818de7d-6833-4011-aded-a3de906237c4\") " pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.622970 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.963438 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2443d8-6a60-4b09-82d5-c3fe639cb819" path="/var/lib/kubelet/pods/7a2443d8-6a60-4b09-82d5-c3fe639cb819/volumes" Oct 02 11:40:01 crc kubenswrapper[4658]: I1002 11:40:01.964817 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dede2bc-0f08-4ce1-8977-c5427f0ad52f" path="/var/lib/kubelet/pods/9dede2bc-0f08-4ce1-8977-c5427f0ad52f/volumes" Oct 02 11:40:02 crc kubenswrapper[4658]: I1002 11:40:02.083843 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:40:02 crc kubenswrapper[4658]: W1002 11:40:02.090382 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf818de7d_6833_4011_aded_a3de906237c4.slice/crio-5312faa7c01f339f04817274afef9bd44c1c33dad8bf0d34656ab97cba577119 WatchSource:0}: Error finding container 5312faa7c01f339f04817274afef9bd44c1c33dad8bf0d34656ab97cba577119: Status 404 returned error can't find the container with id 5312faa7c01f339f04817274afef9bd44c1c33dad8bf0d34656ab97cba577119 Oct 02 11:40:02 crc kubenswrapper[4658]: I1002 11:40:02.228266 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f818de7d-6833-4011-aded-a3de906237c4","Type":"ContainerStarted","Data":"5312faa7c01f339f04817274afef9bd44c1c33dad8bf0d34656ab97cba577119"} Oct 02 11:40:02 crc kubenswrapper[4658]: I1002 11:40:02.229569 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c066a72f-72df-47f5-b481-12ba73cb8d5f","Type":"ContainerStarted","Data":"713cfdf75f1f029f7d57e0845c9a32924a0bc0583fdb3e73ba6a9bb3989d5e2b"} Oct 02 11:40:02 crc kubenswrapper[4658]: I1002 11:40:02.229641 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c066a72f-72df-47f5-b481-12ba73cb8d5f","Type":"ContainerStarted","Data":"dcf73123163ca781c03525975fe38952ae84aeb3ba5caf9ecef998a8e5056b7f"} Oct 02 11:40:02 crc kubenswrapper[4658]: I1002 11:40:02.251011 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.250992504 podStartE2EDuration="2.250992504s" podCreationTimestamp="2025-10-02 11:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:40:02.244327048 +0000 UTC m=+1283.135480645" watchObservedRunningTime="2025-10-02 11:40:02.250992504 +0000 UTC m=+1283.142146071" Oct 02 11:40:03 crc kubenswrapper[4658]: I1002 11:40:03.242432 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f818de7d-6833-4011-aded-a3de906237c4","Type":"ContainerStarted","Data":"c94ad4bcb8d7862411312babb06a51a8614be4e79ef8858761b60871cafb50e2"} Oct 02 11:40:03 crc kubenswrapper[4658]: I1002 11:40:03.243027 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f818de7d-6833-4011-aded-a3de906237c4","Type":"ContainerStarted","Data":"4073294ee5f69f19eaa78dffd0330c428b1510d595a66d9ca7041cf01e8449d5"} Oct 02 11:40:03 crc kubenswrapper[4658]: I1002 11:40:03.273934 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.273909332 podStartE2EDuration="2.273909332s" 
podCreationTimestamp="2025-10-02 11:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:40:03.264159774 +0000 UTC m=+1284.155313351" watchObservedRunningTime="2025-10-02 11:40:03.273909332 +0000 UTC m=+1284.165062909" Oct 02 11:40:05 crc kubenswrapper[4658]: I1002 11:40:05.986581 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:40:06 crc kubenswrapper[4658]: I1002 11:40:06.623766 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:40:06 crc kubenswrapper[4658]: I1002 11:40:06.623809 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:40:08 crc kubenswrapper[4658]: I1002 11:40:08.643906 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:40:08 crc kubenswrapper[4658]: I1002 11:40:08.644264 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:40:09 crc kubenswrapper[4658]: I1002 11:40:09.657520 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b26ff3c-8765-4911-aee2-54a863e4fd7c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:40:09 crc kubenswrapper[4658]: I1002 11:40:09.657562 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b26ff3c-8765-4911-aee2-54a863e4fd7c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:40:10 crc kubenswrapper[4658]: I1002 11:40:10.986375 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:40:11 crc kubenswrapper[4658]: I1002 11:40:11.023034 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:40:11 crc kubenswrapper[4658]: I1002 11:40:11.348739 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:40:11 crc kubenswrapper[4658]: I1002 11:40:11.623993 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:40:11 crc kubenswrapper[4658]: I1002 11:40:11.624356 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:40:12 crc kubenswrapper[4658]: I1002 11:40:12.641422 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f818de7d-6833-4011-aded-a3de906237c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:40:12 crc kubenswrapper[4658]: I1002 11:40:12.641422 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f818de7d-6833-4011-aded-a3de906237c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.652562 4658 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.653205 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.653566 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.653605 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.658782 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:40:18 crc kubenswrapper[4658]: I1002 11:40:18.661823 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:40:19 crc kubenswrapper[4658]: I1002 11:40:19.418942 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:40:21 crc kubenswrapper[4658]: I1002 11:40:21.647096 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:40:21 crc kubenswrapper[4658]: I1002 11:40:21.647780 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:40:21 crc kubenswrapper[4658]: I1002 11:40:21.674432 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:40:21 crc kubenswrapper[4658]: I1002 11:40:21.683788 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:40:27 crc kubenswrapper[4658]: I1002 11:40:27.430046 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:40:27 crc kubenswrapper[4658]: I1002 11:40:27.430635 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:40:30 crc kubenswrapper[4658]: I1002 11:40:30.680539 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:31 crc kubenswrapper[4658]: I1002 11:40:31.456084 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:35 crc kubenswrapper[4658]: I1002 11:40:35.731893 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="rabbitmq" containerID="cri-o://08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6" gracePeriod=604796 Oct 02 11:40:36 crc kubenswrapper[4658]: I1002 11:40:36.069054 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="rabbitmq" containerID="cri-o://d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026" gracePeriod=604795 Oct 02 11:40:42 crc kubenswrapper[4658]: 
E1002 11:40:42.374051 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa01b90_7cce_4e10_ac37_57df39a56df1.slice/crio-conmon-d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa01b90_7cce_4e10_ac37_57df39a56df1.slice/crio-d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.437483 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531024 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531085 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531174 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531308 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531368 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hdmj\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531402 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531426 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531463 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins\") pod 
\"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531529 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531565 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.531626 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf\") pod \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\" (UID: \"4cc6649a-7a89-4658-9a2d-a09cb4f5f860\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.534658 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.546424 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.549204 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.558410 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.578083 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.580486 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj" (OuterVolumeSpecName: "kube-api-access-5hdmj") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "kube-api-access-5hdmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.583574 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info" (OuterVolumeSpecName: "pod-info") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.589113 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633749 4658 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633784 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633797 4658 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633825 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633839 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hdmj\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-kube-api-access-5hdmj\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633853 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633865 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.633876 4658 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc 
kubenswrapper[4658]: I1002 11:40:42.640050 4658 generic.go:334] "Generic (PLEG): container finished" podID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerID="08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6" exitCode=0 Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.640113 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerDied","Data":"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6"} Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.640140 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cc6649a-7a89-4658-9a2d-a09cb4f5f860","Type":"ContainerDied","Data":"9189d902497667418e1e0de6261191f11ca4858e09eb937e77281963dba3794b"} Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.640157 4658 scope.go:117] "RemoveContainer" containerID="08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.640158 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.643378 4658 generic.go:334] "Generic (PLEG): container finished" podID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerID="d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026" exitCode=0 Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.643414 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerDied","Data":"d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026"} Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.646927 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data" (OuterVolumeSpecName: "config-data") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.683886 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.687265 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf" (OuterVolumeSpecName: "server-conf") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.727079 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.735955 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.735989 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.736000 4658 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.744618 4658 scope.go:117] "RemoveContainer" containerID="8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.778472 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4cc6649a-7a89-4658-9a2d-a09cb4f5f860" (UID: "4cc6649a-7a89-4658-9a2d-a09cb4f5f860"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.800714 4658 scope.go:117] "RemoveContainer" containerID="08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6" Oct 02 11:40:42 crc kubenswrapper[4658]: E1002 11:40:42.801339 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6\": container with ID starting with 08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6 not found: ID does not exist" containerID="08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.801382 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6"} err="failed to get container status \"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6\": rpc error: code = NotFound desc = could not find container \"08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6\": container with ID starting with 08d5feee318663606b9273b600c02af5686dfb7fe21ef0a6d2190d91d1b95af6 not found: ID does not exist" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.801413 4658 scope.go:117] "RemoveContainer" containerID="8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73" Oct 02 11:40:42 crc kubenswrapper[4658]: E1002 11:40:42.801772 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73\": container with ID starting with 8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73 not found: ID does not exist" containerID="8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.801799 4658 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73"} err="failed to get container status \"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73\": rpc error: code = NotFound desc = could not find container \"8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73\": container with ID starting with 8a031c8a231a7e7aa7f1dd4ff8ae554dada26c0e9cdfd14e12f1fec004b11c73 not found: ID does not exist" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.837870 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838093 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838226 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838347 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838371 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838391 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838412 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838555 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838611 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info\") 
pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838635 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.838669 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2flp\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp\") pod \"8aa01b90-7cce-4e10-ac37-57df39a56df1\" (UID: \"8aa01b90-7cce-4e10-ac37-57df39a56df1\") " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.839345 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cc6649a-7a89-4658-9a2d-a09cb4f5f860-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.842400 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.843899 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp" (OuterVolumeSpecName: "kube-api-access-x2flp") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "kube-api-access-x2flp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.843954 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.845256 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.846499 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.847450 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.848404 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info" (OuterVolumeSpecName: "pod-info") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.851805 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.873589 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data" (OuterVolumeSpecName: "config-data") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947353 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947438 4658 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aa01b90-7cce-4e10-ac37-57df39a56df1-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947500 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947516 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2flp\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-kube-api-access-x2flp\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947530 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947541 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947642 4658 reconciler_common.go:293] "Volume detached for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aa01b90-7cce-4e10-ac37-57df39a56df1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947660 4658 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.947673 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:42 crc kubenswrapper[4658]: I1002 11:40:42.958440 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf" (OuterVolumeSpecName: "server-conf") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.011206 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.052526 4658 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aa01b90-7cce-4e10-ac37-57df39a56df1-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.052565 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.066605 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8aa01b90-7cce-4e10-ac37-57df39a56df1" (UID: "8aa01b90-7cce-4e10-ac37-57df39a56df1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.150141 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.154857 4658 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aa01b90-7cce-4e10-ac37-57df39a56df1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.166991 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189118 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: E1002 11:40:43.189522 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189545 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: E1002 11:40:43.189565 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="setup-container" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189572 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="setup-container" Oct 02 11:40:43 crc kubenswrapper[4658]: E1002 11:40:43.189582 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189589 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: E1002 11:40:43.189615 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="setup-container" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189621 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="setup-container" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189822 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.189835 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" containerName="rabbitmq" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.190843 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.193500 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.193832 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.196267 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.199025 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.199210 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tzrzj" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.199375 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.199395 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.233579 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256281 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6406a7e-4303-43ed-bb07-2816e29af04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256385 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256436 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256452 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256474 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256492 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256508 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256528 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256549 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256602 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gs6\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-kube-api-access-j4gs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.256620 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6406a7e-4303-43ed-bb07-2816e29af04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.357822 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358066 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358196 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358282 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358427 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358502 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358578 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358582 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358680 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gs6\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-kube-api-access-j4gs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358699 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6406a7e-4303-43ed-bb07-2816e29af04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358776 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6406a7e-4303-43ed-bb07-2816e29af04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.358855 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.359023 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.359223 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.360101 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.360470 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.361133 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6406a7e-4303-43ed-bb07-2816e29af04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.362697 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.363136 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6406a7e-4303-43ed-bb07-2816e29af04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.364582 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6406a7e-4303-43ed-bb07-2816e29af04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.368958 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.381577 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gs6\" (UniqueName: \"kubernetes.io/projected/c6406a7e-4303-43ed-bb07-2816e29af04c-kube-api-access-j4gs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.398665 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6406a7e-4303-43ed-bb07-2816e29af04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.511338 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.674440 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8aa01b90-7cce-4e10-ac37-57df39a56df1","Type":"ContainerDied","Data":"9c7072a35270fcb05ddffab44861d3786fd1c97783db9a19b3db1b7b220031e5"} Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.674844 4658 scope.go:117] "RemoveContainer" containerID="d6da967d9b926334a1d04ebdc6f06a85006a898955b304def4658298bf259026" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.674621 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.721710 4658 scope.go:117] "RemoveContainer" containerID="962021eda53525352e51f7521305c62cf0b06e8762581492eb65c40a47f21d30" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.726161 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.737084 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.763613 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.777023 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.779432 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.779794 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.780195 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.780432 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v52cg" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.780613 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.782987 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.787279 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.814221 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869248 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a129e57-376b-4bc6-8d0c-c667d692d487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869588 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869618 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a129e57-376b-4bc6-8d0c-c667d692d487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869636 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869666 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869684 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869703 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869757 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvsh\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-kube-api-access-prvsh\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869776 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869802 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.869836 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972007 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a129e57-376b-4bc6-8d0c-c667d692d487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972099 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972124 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a129e57-376b-4bc6-8d0c-c667d692d487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972144 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972167 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972188 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972206 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972250 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prvsh\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-kube-api-access-prvsh\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972269 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972318 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972355 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.972900 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.975343 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.975716 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.975955 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.976176 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.977204 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc6649a-7a89-4658-9a2d-a09cb4f5f860" path="/var/lib/kubelet/pods/4cc6649a-7a89-4658-9a2d-a09cb4f5f860/volumes" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.977396 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a129e57-376b-4bc6-8d0c-c667d692d487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.980033 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa01b90-7cce-4e10-ac37-57df39a56df1" path="/var/lib/kubelet/pods/8aa01b90-7cce-4e10-ac37-57df39a56df1/volumes" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.985041 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a129e57-376b-4bc6-8d0c-c667d692d487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.985049 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.986924 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:43 crc kubenswrapper[4658]: I1002 11:40:43.992310 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a129e57-376b-4bc6-8d0c-c667d692d487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.001070 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvsh\" (UniqueName: \"kubernetes.io/projected/6a129e57-376b-4bc6-8d0c-c667d692d487-kube-api-access-prvsh\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.003943 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.029996 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6a129e57-376b-4bc6-8d0c-c667d692d487\") " pod="openstack/rabbitmq-server-0" Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.123403 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.610441 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:40:44 crc kubenswrapper[4658]: W1002 11:40:44.612747 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a129e57_376b_4bc6_8d0c_c667d692d487.slice/crio-1239d644daa5c3a9547850377e9071674329bd1f1de35be0f60d8be1e448d985 WatchSource:0}: Error finding container 1239d644daa5c3a9547850377e9071674329bd1f1de35be0f60d8be1e448d985: Status 404 returned error can't find the container with id 1239d644daa5c3a9547850377e9071674329bd1f1de35be0f60d8be1e448d985 Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.690891 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6406a7e-4303-43ed-bb07-2816e29af04c","Type":"ContainerStarted","Data":"c923fdb1a0acb6961ddbb926880082e8a0a82ec49562f4d27b5a39eef85a44cf"} Oct 02 11:40:44 crc kubenswrapper[4658]: I1002 11:40:44.693334 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a129e57-376b-4bc6-8d0c-c667d692d487","Type":"ContainerStarted","Data":"1239d644daa5c3a9547850377e9071674329bd1f1de35be0f60d8be1e448d985"} Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.008274 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.010379 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.015555 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.039140 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137654 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137730 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137793 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137821 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137882 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137910 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9w6\" (UniqueName: \"kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.137956 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.239924 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: 
\"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.239996 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.240057 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.240087 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.240259 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.240305 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9w6\" (UniqueName: \"kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.240360 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.241576 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.241669 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.243118 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " 
pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.243541 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.243654 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.245091 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.278454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9w6\" (UniqueName: \"kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6\") pod \"dnsmasq-dns-67b789f86c-9w46d\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.335363 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.723395 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a129e57-376b-4bc6-8d0c-c667d692d487","Type":"ContainerStarted","Data":"26c261d033f99db9dbb230fb198f47647fc942a67376e65022f4abc16696fb98"} Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.727838 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6406a7e-4303-43ed-bb07-2816e29af04c","Type":"ContainerStarted","Data":"9372c5eb32731cfc84a3a811d0637a77b6aa49ec0a275ba5d8f8f994499fbf99"} Oct 02 11:40:46 crc kubenswrapper[4658]: I1002 11:40:46.880805 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:40:46 crc kubenswrapper[4658]: W1002 11:40:46.900854 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd11a1dcf_5039_4f8f_b8f6_6434f25f247f.slice/crio-2810576c44da0de03bd96cb711550f6151c95c47c48ca0aab7361dddd3402c99 WatchSource:0}: Error finding container 2810576c44da0de03bd96cb711550f6151c95c47c48ca0aab7361dddd3402c99: Status 404 returned error can't find the container with id 2810576c44da0de03bd96cb711550f6151c95c47c48ca0aab7361dddd3402c99 Oct 02 11:40:47 crc kubenswrapper[4658]: I1002 11:40:47.738527 4658 generic.go:334] "Generic (PLEG): container finished" podID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerID="76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627" exitCode=0 Oct 02 11:40:47 crc kubenswrapper[4658]: I1002 11:40:47.738653 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" 
event={"ID":"d11a1dcf-5039-4f8f-b8f6-6434f25f247f","Type":"ContainerDied","Data":"76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627"} Oct 02 11:40:47 crc kubenswrapper[4658]: I1002 11:40:47.739137 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" event={"ID":"d11a1dcf-5039-4f8f-b8f6-6434f25f247f","Type":"ContainerStarted","Data":"2810576c44da0de03bd96cb711550f6151c95c47c48ca0aab7361dddd3402c99"} Oct 02 11:40:48 crc kubenswrapper[4658]: I1002 11:40:48.752112 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" event={"ID":"d11a1dcf-5039-4f8f-b8f6-6434f25f247f","Type":"ContainerStarted","Data":"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af"} Oct 02 11:40:48 crc kubenswrapper[4658]: I1002 11:40:48.752413 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:48 crc kubenswrapper[4658]: I1002 11:40:48.776067 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" podStartSLOduration=3.77604818 podStartE2EDuration="3.77604818s" podCreationTimestamp="2025-10-02 11:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:40:48.77204862 +0000 UTC m=+1329.663202227" watchObservedRunningTime="2025-10-02 11:40:48.77604818 +0000 UTC m=+1329.667201747" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.337684 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.433766 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.433998 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="dnsmasq-dns" containerID="cri-o://723d3ae1b72510b7c1e6654dd8d2ab655ec4633e7062a03758012a249d06f7d6" gracePeriod=10 Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.586757 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bcf8b9d95-6qc52"] Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.589593 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.617421 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bcf8b9d95-6qc52"] Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.674974 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675058 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675085 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntfdr\" (UniqueName: \"kubernetes.io/projected/d2ab47cf-8dcb-4517-b4de-a064181594e0-kube-api-access-ntfdr\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675121 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-config\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675145 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-svc\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675182 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.675216 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777053 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777111 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777218 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777265 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777285 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntfdr\" (UniqueName: \"kubernetes.io/projected/d2ab47cf-8dcb-4517-b4de-a064181594e0-kube-api-access-ntfdr\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777334 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-config\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.777358 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-svc\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.778237 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.778237 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.778288 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.778489 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-config\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.778593 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.779170 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2ab47cf-8dcb-4517-b4de-a064181594e0-dns-svc\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.813510 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntfdr\" (UniqueName: \"kubernetes.io/projected/d2ab47cf-8dcb-4517-b4de-a064181594e0-kube-api-access-ntfdr\") pod \"dnsmasq-dns-6bcf8b9d95-6qc52\" (UID: \"d2ab47cf-8dcb-4517-b4de-a064181594e0\") " pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.831105 4658 generic.go:334] "Generic (PLEG): container finished" podID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerID="723d3ae1b72510b7c1e6654dd8d2ab655ec4633e7062a03758012a249d06f7d6" exitCode=0 Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.831214 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" event={"ID":"0a3249d7-8466-4f22-ba34-a4d6533e1de4","Type":"ContainerDied","Data":"723d3ae1b72510b7c1e6654dd8d2ab655ec4633e7062a03758012a249d06f7d6"} Oct 02 11:40:56 crc kubenswrapper[4658]: I1002 11:40:56.929900 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.054030 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.084714 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.084884 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.084931 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.084976 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.085015 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.085131 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdcfw\" (UniqueName: \"kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw\") pod \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\" (UID: \"0a3249d7-8466-4f22-ba34-a4d6533e1de4\") " Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.109032 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw" (OuterVolumeSpecName: "kube-api-access-sdcfw") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "kube-api-access-sdcfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.166484 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config" (OuterVolumeSpecName: "config") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.178508 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.181696 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.187712 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.187744 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.187753 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.187762 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdcfw\" (UniqueName: \"kubernetes.io/projected/0a3249d7-8466-4f22-ba34-a4d6533e1de4-kube-api-access-sdcfw\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.197156 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.211711 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a3249d7-8466-4f22-ba34-a4d6533e1de4" (UID: "0a3249d7-8466-4f22-ba34-a4d6533e1de4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.300251 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.300334 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3249d7-8466-4f22-ba34-a4d6533e1de4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.429909 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.429986 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.449621 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bcf8b9d95-6qc52"] Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.847000 4658 generic.go:334] "Generic (PLEG): container finished" podID="d2ab47cf-8dcb-4517-b4de-a064181594e0" containerID="3fd8bb3aa792596a4b9317c995f570820258df809de7121ee07561b5672ece1d" exitCode=0 Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.847378 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" event={"ID":"d2ab47cf-8dcb-4517-b4de-a064181594e0","Type":"ContainerDied","Data":"3fd8bb3aa792596a4b9317c995f570820258df809de7121ee07561b5672ece1d"} Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.847499 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" event={"ID":"d2ab47cf-8dcb-4517-b4de-a064181594e0","Type":"ContainerStarted","Data":"4a187a10d57261c7fbdbeea84d0afb77621161878d8622e0d0afcdc510f00019"} Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.852531 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" event={"ID":"0a3249d7-8466-4f22-ba34-a4d6533e1de4","Type":"ContainerDied","Data":"4b8d2ccb543adb1f6116aa4b8a8afb33388dba72fd61aa94045c4911ea6aff42"} Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.852575 4658 scope.go:117] "RemoveContainer" containerID="723d3ae1b72510b7c1e6654dd8d2ab655ec4633e7062a03758012a249d06f7d6" Oct 02 11:40:57 crc kubenswrapper[4658]: I1002 11:40:57.852747 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:40:58 crc kubenswrapper[4658]: I1002 11:40:58.006444 4658 scope.go:117] "RemoveContainer" containerID="0b531ab7d9624e0654f1ab4dd15e708611fb16825891ff85c69fbf1c0579ba63" Oct 02 11:40:58 crc kubenswrapper[4658]: I1002 11:40:58.863809 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" event={"ID":"d2ab47cf-8dcb-4517-b4de-a064181594e0","Type":"ContainerStarted","Data":"f19964edc569e7282af31e9e232d0b452db462ed3b753198d78b87a4114a87cf"} Oct 02 11:40:58 crc kubenswrapper[4658]: I1002 11:40:58.864188 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:40:58 crc kubenswrapper[4658]: I1002 11:40:58.894325 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" podStartSLOduration=2.8943030309999997 podStartE2EDuration="2.894303031s" podCreationTimestamp="2025-10-02 11:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:40:58.882094905 +0000 UTC m=+1339.773248482" watchObservedRunningTime="2025-10-02 11:40:58.894303031 +0000 UTC m=+1339.785456598" Oct 02 11:41:06 crc kubenswrapper[4658]: I1002 11:41:06.931487 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bcf8b9d95-6qc52" Oct 02 11:41:06 crc kubenswrapper[4658]: I1002 11:41:06.985859 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:41:06 crc kubenswrapper[4658]: I1002 11:41:06.986385 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="dnsmasq-dns" containerID="cri-o://317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af" gracePeriod=10 Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.563706 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625508 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625687 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625738 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625757 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625801 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625843 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.625913 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx9w6\" (UniqueName: \"kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6\") pod \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\" (UID: \"d11a1dcf-5039-4f8f-b8f6-6434f25f247f\") " Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.643669 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6" (OuterVolumeSpecName: "kube-api-access-qx9w6") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "kube-api-access-qx9w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.705687 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.706073 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.706326 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config" (OuterVolumeSpecName: "config") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.728878 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx9w6\" (UniqueName: \"kubernetes.io/projected/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-kube-api-access-qx9w6\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.728917 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.728930 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.728941 4658 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.736839 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.763131 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.763558 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d11a1dcf-5039-4f8f-b8f6-6434f25f247f" (UID: "d11a1dcf-5039-4f8f-b8f6-6434f25f247f"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.831054 4658 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.831108 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.831117 4658 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d11a1dcf-5039-4f8f-b8f6-6434f25f247f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.974514 4658 generic.go:334] "Generic (PLEG): container finished" podID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerID="317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af" exitCode=0 Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.975074 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" event={"ID":"d11a1dcf-5039-4f8f-b8f6-6434f25f247f","Type":"ContainerDied","Data":"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af"} Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.975133 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.978626 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-9w46d" event={"ID":"d11a1dcf-5039-4f8f-b8f6-6434f25f247f","Type":"ContainerDied","Data":"2810576c44da0de03bd96cb711550f6151c95c47c48ca0aab7361dddd3402c99"} Oct 02 11:41:07 crc kubenswrapper[4658]: I1002 11:41:07.978657 4658 scope.go:117] "RemoveContainer" containerID="317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af" Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.015541 4658 scope.go:117] "RemoveContainer" containerID="76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627" Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.027085 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.040150 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-9w46d"] Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.051858 4658 scope.go:117] "RemoveContainer" containerID="317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af" Oct 02 11:41:08 crc kubenswrapper[4658]: E1002 11:41:08.052406 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af\": container with ID starting with 317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af not found: ID does not exist" containerID="317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af" Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.052445 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af"} err="failed to get container status 
\"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af\": rpc error: code = NotFound desc = could not find container \"317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af\": container with ID starting with 317ae13ba6c51e70de87b5a68d8faf56b6d513c315be5a2c92295a43345f16af not found: ID does not exist" Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.052473 4658 scope.go:117] "RemoveContainer" containerID="76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627" Oct 02 11:41:08 crc kubenswrapper[4658]: E1002 11:41:08.053145 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627\": container with ID starting with 76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627 not found: ID does not exist" containerID="76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627" Oct 02 11:41:08 crc kubenswrapper[4658]: I1002 11:41:08.053190 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627"} err="failed to get container status \"76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627\": rpc error: code = NotFound desc = could not find container \"76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627\": container with ID starting with 76a988d65207d4bf67d54e87675c3c366c15cc77571726a2d4e7e7d6d55aa627 not found: ID does not exist" Oct 02 11:41:09 crc kubenswrapper[4658]: I1002 11:41:09.961420 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" path="/var/lib/kubelet/pods/d11a1dcf-5039-4f8f-b8f6-6434f25f247f/volumes" Oct 02 11:41:19 crc kubenswrapper[4658]: I1002 11:41:19.112835 4658 generic.go:334] "Generic (PLEG): container finished" podID="6a129e57-376b-4bc6-8d0c-c667d692d487" containerID="26c261d033f99db9dbb230fb198f47647fc942a67376e65022f4abc16696fb98" exitCode=0 Oct 02 11:41:19 crc kubenswrapper[4658]: I1002 11:41:19.112947 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a129e57-376b-4bc6-8d0c-c667d692d487","Type":"ContainerDied","Data":"26c261d033f99db9dbb230fb198f47647fc942a67376e65022f4abc16696fb98"} Oct 02 11:41:19 crc kubenswrapper[4658]: I1002 11:41:19.115606 4658 generic.go:334] "Generic (PLEG): container finished" podID="c6406a7e-4303-43ed-bb07-2816e29af04c" containerID="9372c5eb32731cfc84a3a811d0637a77b6aa49ec0a275ba5d8f8f994499fbf99" exitCode=0 Oct 02 11:41:19 crc kubenswrapper[4658]: I1002 11:41:19.115647 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6406a7e-4303-43ed-bb07-2816e29af04c","Type":"ContainerDied","Data":"9372c5eb32731cfc84a3a811d0637a77b6aa49ec0a275ba5d8f8f994499fbf99"} Oct 02 11:41:20 crc kubenswrapper[4658]: I1002 11:41:20.127153 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6406a7e-4303-43ed-bb07-2816e29af04c","Type":"ContainerStarted","Data":"75d30b6de6bfe1e94e93ef3037df92f80a6d4ca3735869dd7555f21095f52b3a"} Oct 02 11:41:20 crc kubenswrapper[4658]: I1002 11:41:20.128211 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:41:20 crc kubenswrapper[4658]: I1002 11:41:20.130888 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"6a129e57-376b-4bc6-8d0c-c667d692d487","Type":"ContainerStarted","Data":"429ac794b39aaa0ed919f8f05746f7508c508df2284bde3cdb2a7ec53fe81589"} Oct 02 11:41:20 crc kubenswrapper[4658]: I1002 11:41:20.131193 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 11:41:20 crc kubenswrapper[4658]: I1002 11:41:20.170263 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.170242496 podStartE2EDuration="37.170242496s" podCreationTimestamp="2025-10-02 11:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:41:20.158710252 +0000 UTC m=+1361.049863829" watchObservedRunningTime="2025-10-02 11:41:20.170242496 +0000 UTC m=+1361.061396063" Oct 02 11:41:21 crc kubenswrapper[4658]: I1002 11:41:21.049877 4658 scope.go:117] "RemoveContainer" containerID="608efdb34234573e4bb1f230ad3e2310cbd43b125bc7a055df498c4cd6d8fb4d" Oct 02 11:41:21 crc kubenswrapper[4658]: I1002 11:41:21.080606 4658 scope.go:117] "RemoveContainer" containerID="b00b7e97de47c65ec16441dcc377ff098c48bccf44e592c0aaad90cdd26ed224" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.842097 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.842072984 podStartE2EDuration="42.842072984s" podCreationTimestamp="2025-10-02 11:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:41:20.190890387 +0000 UTC m=+1361.082043954" watchObservedRunningTime="2025-10-02 11:41:25.842072984 +0000 UTC m=+1366.733226551" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845066 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb"] Oct 02 11:41:25 crc kubenswrapper[4658]: E1002 11:41:25.845462 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845474 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: E1002 11:41:25.845493 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845499 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: E1002 11:41:25.845525 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="init" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845532 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="init" Oct 02 11:41:25 crc kubenswrapper[4658]: E1002 11:41:25.845555 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="init" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845560 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="init" Oct 
02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845734 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.845750 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11a1dcf-5039-4f8f-b8f6-6434f25f247f" containerName="dnsmasq-dns" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.846409 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.855762 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb"] Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.882001 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.882017 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.882028 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.882001 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.996222 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.996507 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.996780 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4h2s\" (UniqueName: \"kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:25 crc kubenswrapper[4658]: I1002 11:41:25.996943 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.099177 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4h2s\" (UniqueName: 
\"kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.099811 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.100741 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.100870 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.105985 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.106944 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.119626 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.119680 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4h2s\" (UniqueName: \"kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:26 crc kubenswrapper[4658]: I1002 11:41:26.198713 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.038365 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb"] Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.244599 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" event={"ID":"4dbacd18-944b-4b5f-be12-5ac2c1cb163a","Type":"ContainerStarted","Data":"ac62d47810426f81788d71940af1bedfc67833d7752044aa412d7f34387328f4"} Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.429991 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.430066 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.430123 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.430785 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:41:27 crc kubenswrapper[4658]: I1002 11:41:27.430842 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11" gracePeriod=600 Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.022657 4658 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0a3249d7-8466-4f22-ba34-a4d6533e1de4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0a3249d7-8466-4f22-ba34-a4d6533e1de4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0a3249d7_8466_4f22_ba34_a4d6533e1de4.slice" Oct 02 11:41:28 crc kubenswrapper[4658]: E1002 11:41:28.022991 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod0a3249d7-8466-4f22-ba34-a4d6533e1de4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod0a3249d7-8466-4f22-ba34-a4d6533e1de4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0a3249d7_8466_4f22_ba34_a4d6533e1de4.slice" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.264019 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" 
containerID="291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11" exitCode=0 Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.264106 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11"} Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.264317 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fd56n" Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.264346 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f"} Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.264374 4658 scope.go:117] "RemoveContainer" containerID="070d9ca89b2be9f5cb302e4464d452f6af7427a486ef0fedb26718058c812952" Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.339516 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:41:28 crc kubenswrapper[4658]: I1002 11:41:28.354541 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fd56n"] Oct 02 11:41:29 crc kubenswrapper[4658]: I1002 11:41:29.965224 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3249d7-8466-4f22-ba34-a4d6533e1de4" path="/var/lib/kubelet/pods/0a3249d7-8466-4f22-ba34-a4d6533e1de4/volumes" Oct 02 11:41:33 crc kubenswrapper[4658]: I1002 11:41:33.515475 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:41:34 crc kubenswrapper[4658]: I1002 11:41:34.127504 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 11:41:36 crc kubenswrapper[4658]: I1002 11:41:36.347170 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" event={"ID":"4dbacd18-944b-4b5f-be12-5ac2c1cb163a","Type":"ContainerStarted","Data":"2c011066c0be3bd7a7caf21451c62aa251e9af8f31b8fcf663a17fa38cc58ea4"} Oct 02 11:41:36 crc kubenswrapper[4658]: I1002 11:41:36.375015 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" podStartSLOduration=2.35996337 podStartE2EDuration="11.374997579s" podCreationTimestamp="2025-10-02 11:41:25 +0000 UTC" firstStartedPulling="2025-10-02 11:41:27.041009698 +0000 UTC m=+1367.932163265" lastFinishedPulling="2025-10-02 11:41:36.056043907 +0000 UTC m=+1376.947197474" observedRunningTime="2025-10-02 11:41:36.36359884 +0000 UTC m=+1377.254752407" watchObservedRunningTime="2025-10-02 11:41:36.374997579 +0000 UTC m=+1377.266151146" Oct 02 11:41:48 crc kubenswrapper[4658]: I1002 11:41:48.468444 4658 generic.go:334] "Generic (PLEG): container finished" podID="4dbacd18-944b-4b5f-be12-5ac2c1cb163a" containerID="2c011066c0be3bd7a7caf21451c62aa251e9af8f31b8fcf663a17fa38cc58ea4" exitCode=0 Oct 02 11:41:48 crc kubenswrapper[4658]: I1002 11:41:48.468488 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" 
event={"ID":"4dbacd18-944b-4b5f-be12-5ac2c1cb163a","Type":"ContainerDied","Data":"2c011066c0be3bd7a7caf21451c62aa251e9af8f31b8fcf663a17fa38cc58ea4"} Oct 02 11:41:49 crc kubenswrapper[4658]: I1002 11:41:49.971026 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.060989 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory\") pod \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.061088 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle\") pod \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.061107 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key\") pod \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.061149 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4h2s\" (UniqueName: \"kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s\") pod \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\" (UID: \"4dbacd18-944b-4b5f-be12-5ac2c1cb163a\") " Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.066740 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s" (OuterVolumeSpecName: "kube-api-access-f4h2s") pod "4dbacd18-944b-4b5f-be12-5ac2c1cb163a" (UID: "4dbacd18-944b-4b5f-be12-5ac2c1cb163a"). InnerVolumeSpecName "kube-api-access-f4h2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.066811 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4dbacd18-944b-4b5f-be12-5ac2c1cb163a" (UID: "4dbacd18-944b-4b5f-be12-5ac2c1cb163a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.088427 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory" (OuterVolumeSpecName: "inventory") pod "4dbacd18-944b-4b5f-be12-5ac2c1cb163a" (UID: "4dbacd18-944b-4b5f-be12-5ac2c1cb163a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.089781 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4dbacd18-944b-4b5f-be12-5ac2c1cb163a" (UID: "4dbacd18-944b-4b5f-be12-5ac2c1cb163a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.163527 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4h2s\" (UniqueName: \"kubernetes.io/projected/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-kube-api-access-f4h2s\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.163561 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.163571 4658 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.163583 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dbacd18-944b-4b5f-be12-5ac2c1cb163a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.490503 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" event={"ID":"4dbacd18-944b-4b5f-be12-5ac2c1cb163a","Type":"ContainerDied","Data":"ac62d47810426f81788d71940af1bedfc67833d7752044aa412d7f34387328f4"} Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.490548 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac62d47810426f81788d71940af1bedfc67833d7752044aa412d7f34387328f4" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.490550 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.549582 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7"] Oct 02 11:41:50 crc kubenswrapper[4658]: E1002 11:41:50.550001 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbacd18-944b-4b5f-be12-5ac2c1cb163a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.550017 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbacd18-944b-4b5f-be12-5ac2c1cb163a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.550225 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbacd18-944b-4b5f-be12-5ac2c1cb163a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.551039 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.552661 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.553270 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.554410 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.556190 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.572005 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7"] Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.673395 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.673716 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.673776 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xq54\" (UniqueName: \"kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.775918 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xq54\" (UniqueName: \"kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.776013 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.776149 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.781338 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.784148 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.792248 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xq54\" (UniqueName: \"kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwgw7\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:50 crc kubenswrapper[4658]: I1002 11:41:50.870661 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:51 crc kubenswrapper[4658]: I1002 11:41:51.389688 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7"] Oct 02 11:41:51 crc kubenswrapper[4658]: W1002 11:41:51.396513 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod270f59c2_b21f_4b38_821c_5c1b4ce0be21.slice/crio-7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909 WatchSource:0}: Error finding container 7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909: Status 404 returned error can't find the container with id 7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909 Oct 02 11:41:51 crc kubenswrapper[4658]: I1002 11:41:51.500492 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" event={"ID":"270f59c2-b21f-4b38-821c-5c1b4ce0be21","Type":"ContainerStarted","Data":"7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909"} Oct 02 11:41:52 crc kubenswrapper[4658]: I1002 11:41:52.524328 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" event={"ID":"270f59c2-b21f-4b38-821c-5c1b4ce0be21","Type":"ContainerStarted","Data":"3a019aad8725d6626df26f443191579b1b5d9fb922c488d9471a100fbe133d89"} Oct 02 11:41:52 crc kubenswrapper[4658]: I1002 11:41:52.559628 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" podStartSLOduration=2.044808352 podStartE2EDuration="2.559607058s" podCreationTimestamp="2025-10-02 11:41:50 +0000 UTC" firstStartedPulling="2025-10-02 11:41:51.398483143 +0000 UTC m=+1392.289636710" lastFinishedPulling="2025-10-02 11:41:51.913281859 +0000 UTC m=+1392.804435416" observedRunningTime="2025-10-02 11:41:52.542878684 +0000 UTC m=+1393.434032261" watchObservedRunningTime="2025-10-02 11:41:52.559607058 +0000 UTC 
m=+1393.450760625" Oct 02 11:41:55 crc kubenswrapper[4658]: I1002 11:41:55.554805 4658 generic.go:334] "Generic (PLEG): container finished" podID="270f59c2-b21f-4b38-821c-5c1b4ce0be21" containerID="3a019aad8725d6626df26f443191579b1b5d9fb922c488d9471a100fbe133d89" exitCode=0 Oct 02 11:41:55 crc kubenswrapper[4658]: I1002 11:41:55.554910 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" event={"ID":"270f59c2-b21f-4b38-821c-5c1b4ce0be21","Type":"ContainerDied","Data":"3a019aad8725d6626df26f443191579b1b5d9fb922c488d9471a100fbe133d89"} Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.029600 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.223092 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key\") pod \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.223262 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xq54\" (UniqueName: \"kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54\") pod \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.223465 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory\") pod \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\" (UID: \"270f59c2-b21f-4b38-821c-5c1b4ce0be21\") " Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.229540 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54" (OuterVolumeSpecName: "kube-api-access-6xq54") pod "270f59c2-b21f-4b38-821c-5c1b4ce0be21" (UID: "270f59c2-b21f-4b38-821c-5c1b4ce0be21"). InnerVolumeSpecName "kube-api-access-6xq54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.259235 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory" (OuterVolumeSpecName: "inventory") pod "270f59c2-b21f-4b38-821c-5c1b4ce0be21" (UID: "270f59c2-b21f-4b38-821c-5c1b4ce0be21"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.259273 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "270f59c2-b21f-4b38-821c-5c1b4ce0be21" (UID: "270f59c2-b21f-4b38-821c-5c1b4ce0be21"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.326376 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.326439 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270f59c2-b21f-4b38-821c-5c1b4ce0be21-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.326457 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xq54\" (UniqueName: \"kubernetes.io/projected/270f59c2-b21f-4b38-821c-5c1b4ce0be21-kube-api-access-6xq54\") on node \"crc\" DevicePath \"\"" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.576932 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" event={"ID":"270f59c2-b21f-4b38-821c-5c1b4ce0be21","Type":"ContainerDied","Data":"7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909"} Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.577192 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ed47df4ca215f00c134f83bb37d7fa62c90283b587385a9bfbd97667fe05909" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.577013 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwgw7" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.661286 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx"] Oct 02 11:41:57 crc kubenswrapper[4658]: E1002 11:41:57.661759 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270f59c2-b21f-4b38-821c-5c1b4ce0be21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.661778 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="270f59c2-b21f-4b38-821c-5c1b4ce0be21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.662025 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="270f59c2-b21f-4b38-821c-5c1b4ce0be21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.662754 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.668988 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.669069 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.669161 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.677670 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.685707 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx"] Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.838353 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfh7\" (UniqueName: \"kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.838424 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.838602 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.838638 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.940515 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfh7\" (UniqueName: \"kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.940839 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.941073 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.941200 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.946049 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.946236 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.946650 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.956360 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfh7\" (UniqueName: \"kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:57 crc kubenswrapper[4658]: I1002 11:41:57.982147 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:41:58 crc kubenswrapper[4658]: I1002 11:41:58.536422 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx"] Oct 02 11:41:58 crc kubenswrapper[4658]: I1002 11:41:58.587618 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" event={"ID":"3e768ea4-04c3-4825-9431-a37f41f34a01","Type":"ContainerStarted","Data":"437b8fccd561715cdc2047437c3cb33a509139966c0ee8ea7d62b67f3f6ececb"} Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.605575 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" event={"ID":"3e768ea4-04c3-4825-9431-a37f41f34a01","Type":"ContainerStarted","Data":"8d3d6fa6fdeab877df7f3591d4fe5a45fd6361963bcfd6ad52a8e6a69f257b9c"} Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.641842 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" podStartSLOduration=2.146791977 podStartE2EDuration="2.6418206s" podCreationTimestamp="2025-10-02 11:41:57 +0000 UTC" firstStartedPulling="2025-10-02 11:41:58.535222936 +0000 UTC m=+1399.426376503" lastFinishedPulling="2025-10-02 11:41:59.030251559 +0000 UTC m=+1399.921405126" observedRunningTime="2025-10-02 11:41:59.627393051 +0000 UTC m=+1400.518546618" watchObservedRunningTime="2025-10-02 11:41:59.6418206 +0000 UTC m=+1400.532974167" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.700942 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.702847 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.712385 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.780927 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.781145 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhv7p\" (UniqueName: \"kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.781179 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.883762 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhv7p\" (UniqueName: \"kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.883845 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.883920 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.884586 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.884603 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:41:59 crc kubenswrapper[4658]: I1002 11:41:59.906741 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bhv7p\" (UniqueName: \"kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p\") pod \"redhat-operators-zlh9j\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:00 crc kubenswrapper[4658]: I1002 11:42:00.024596 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:00 crc kubenswrapper[4658]: I1002 11:42:00.520754 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:42:00 crc kubenswrapper[4658]: I1002 11:42:00.622138 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerStarted","Data":"16b9ba09d577ceb3c2004bddca44c6bed556c849b34383dce35867c34b016f3a"} Oct 02 11:42:01 crc kubenswrapper[4658]: I1002 11:42:01.634814 4658 generic.go:334] "Generic (PLEG): container finished" podID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerID="e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f" exitCode=0 Oct 02 11:42:01 crc kubenswrapper[4658]: I1002 11:42:01.635107 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerDied","Data":"e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f"} Oct 02 11:42:03 crc kubenswrapper[4658]: I1002 11:42:03.655637 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerStarted","Data":"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075"} Oct 02 11:42:04 crc kubenswrapper[4658]: I1002 11:42:04.668981 4658 generic.go:334] "Generic (PLEG): container finished" podID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerID="2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075" exitCode=0 Oct 02 11:42:04 crc kubenswrapper[4658]: I1002 11:42:04.669087 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerDied","Data":"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075"} Oct 02 11:42:05 crc kubenswrapper[4658]: I1002 11:42:05.692574 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerStarted","Data":"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472"} Oct 02 11:42:05 crc kubenswrapper[4658]: I1002 11:42:05.718842 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zlh9j" podStartSLOduration=3.006346083 podStartE2EDuration="6.718818971s" podCreationTimestamp="2025-10-02 11:41:59 +0000 UTC" firstStartedPulling="2025-10-02 11:42:01.636939931 +0000 UTC m=+1402.528093488" lastFinishedPulling="2025-10-02 11:42:05.349412789 +0000 UTC m=+1406.240566376" observedRunningTime="2025-10-02 11:42:05.711267166 +0000 UTC m=+1406.602420733" watchObservedRunningTime="2025-10-02 11:42:05.718818971 +0000 UTC m=+1406.609972538" Oct 02 11:42:10 crc kubenswrapper[4658]: I1002 11:42:10.025453 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 
11:42:10 crc kubenswrapper[4658]: I1002 11:42:10.025975 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:11 crc kubenswrapper[4658]: I1002 11:42:11.082824 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zlh9j" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="registry-server" probeResult="failure" output=< Oct 02 11:42:11 crc kubenswrapper[4658]: timeout: failed to connect service ":50051" within 1s Oct 02 11:42:11 crc kubenswrapper[4658]: > Oct 02 11:42:20 crc kubenswrapper[4658]: I1002 11:42:20.074630 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:20 crc kubenswrapper[4658]: I1002 11:42:20.127195 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:20 crc kubenswrapper[4658]: I1002 11:42:20.312003 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:42:21 crc kubenswrapper[4658]: I1002 11:42:21.235225 4658 scope.go:117] "RemoveContainer" containerID="bb8d1255886557c2f7bdfa62d0d469477dd7b6fb601d51a51307d11ed1e234a4" Oct 02 11:42:21 crc kubenswrapper[4658]: I1002 11:42:21.847798 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zlh9j" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="registry-server" containerID="cri-o://15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472" gracePeriod=2 Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.305375 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.429948 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities\") pod \"c64025d0-2897-4f95-961c-b41f0f84c6a4\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.430010 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhv7p\" (UniqueName: \"kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p\") pod \"c64025d0-2897-4f95-961c-b41f0f84c6a4\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.430033 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content\") pod \"c64025d0-2897-4f95-961c-b41f0f84c6a4\" (UID: \"c64025d0-2897-4f95-961c-b41f0f84c6a4\") " Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.433810 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities" (OuterVolumeSpecName: "utilities") pod "c64025d0-2897-4f95-961c-b41f0f84c6a4" (UID: "c64025d0-2897-4f95-961c-b41f0f84c6a4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.436709 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p" (OuterVolumeSpecName: "kube-api-access-bhv7p") pod "c64025d0-2897-4f95-961c-b41f0f84c6a4" (UID: "c64025d0-2897-4f95-961c-b41f0f84c6a4"). InnerVolumeSpecName "kube-api-access-bhv7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.516076 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c64025d0-2897-4f95-961c-b41f0f84c6a4" (UID: "c64025d0-2897-4f95-961c-b41f0f84c6a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.532872 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.532938 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhv7p\" (UniqueName: \"kubernetes.io/projected/c64025d0-2897-4f95-961c-b41f0f84c6a4-kube-api-access-bhv7p\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.532953 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64025d0-2897-4f95-961c-b41f0f84c6a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.860881 4658 generic.go:334] "Generic (PLEG): container finished" podID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerID="15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472" exitCode=0 Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.860923 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerDied","Data":"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472"} Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.860942 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh9j" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.860953 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh9j" event={"ID":"c64025d0-2897-4f95-961c-b41f0f84c6a4","Type":"ContainerDied","Data":"16b9ba09d577ceb3c2004bddca44c6bed556c849b34383dce35867c34b016f3a"} Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.860973 4658 scope.go:117] "RemoveContainer" containerID="15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.885333 4658 scope.go:117] "RemoveContainer" containerID="2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.899450 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.907846 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zlh9j"] Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.913455 4658 scope.go:117] "RemoveContainer" containerID="e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.970065 4658 scope.go:117] "RemoveContainer" containerID="15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472" Oct 02 11:42:22 crc kubenswrapper[4658]: E1002 11:42:22.970583 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472\": container with ID starting with 15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472 not found: ID does not exist" containerID="15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.970631 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472"} err="failed to get container status \"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472\": rpc error: code = NotFound desc = could not find container \"15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472\": container with ID starting with 15449ee653b074a2c50f42abbfa9550872722f69b4ecceffe0fd775badf5e472 not found: ID does not exist" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.970662 4658 scope.go:117] "RemoveContainer" containerID="2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075" Oct 02 11:42:22 crc kubenswrapper[4658]: E1002 11:42:22.973780 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075\": container with ID starting with 2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075 not found: ID does not exist" containerID="2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.973847 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075"} err="failed to get container status \"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075\": rpc error: code = NotFound desc = could not find container 
\"2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075\": container with ID starting with 2cb45dad3b089fdb7c33fe77314997b57c1222a454b23d2d59c4fe5803660075 not found: ID does not exist" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.973886 4658 scope.go:117] "RemoveContainer" containerID="e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f" Oct 02 11:42:22 crc kubenswrapper[4658]: E1002 11:42:22.974733 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f\": container with ID starting with e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f not found: ID does not exist" containerID="e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f" Oct 02 11:42:22 crc kubenswrapper[4658]: I1002 11:42:22.974773 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f"} err="failed to get container status \"e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f\": rpc error: code = NotFound desc = could not find container \"e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f\": container with ID starting with e0ad75ca0118fbc62522b1a1c62bf1882a19b9205ca7fc3d521247450dede76f not found: ID does not exist" Oct 02 11:42:23 crc kubenswrapper[4658]: I1002 11:42:23.962684 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" path="/var/lib/kubelet/pods/c64025d0-2897-4f95-961c-b41f0f84c6a4/volumes" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.299978 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:13 crc kubenswrapper[4658]: E1002 11:43:13.301026 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="extract-utilities" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.301039 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="extract-utilities" Oct 02 11:43:13 crc kubenswrapper[4658]: E1002 11:43:13.301066 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="registry-server" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.301074 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="registry-server" Oct 02 11:43:13 crc kubenswrapper[4658]: E1002 11:43:13.301107 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="extract-content" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.301114 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="extract-content" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.301425 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64025d0-2897-4f95-961c-b41f0f84c6a4" containerName="registry-server" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.303043 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.316036 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.388850 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.388922 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclt9\" (UniqueName: \"kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.389260 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.491550 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.491629 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclt9\" (UniqueName: \"kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.491738 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.492371 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.492623 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.512205 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pclt9\" (UniqueName: \"kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9\") pod \"certified-operators-vjbwc\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:13 crc kubenswrapper[4658]: I1002 11:43:13.623477 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:14 crc kubenswrapper[4658]: I1002 11:43:14.165713 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:14 crc kubenswrapper[4658]: I1002 11:43:14.381343 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerStarted","Data":"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e"} Oct 02 11:43:14 crc kubenswrapper[4658]: I1002 11:43:14.381393 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerStarted","Data":"8c60c7d700d662e5a84caffdc23cb4e47331c8b775929fa4ae581143207f4b25"} Oct 02 11:43:15 crc kubenswrapper[4658]: I1002 11:43:15.391392 4658 generic.go:334] "Generic (PLEG): container finished" podID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerID="918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e" exitCode=0 Oct 02 11:43:15 crc kubenswrapper[4658]: I1002 11:43:15.391482 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerDied","Data":"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e"} Oct 02 11:43:16 crc kubenswrapper[4658]: I1002 11:43:16.404657 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerStarted","Data":"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142"} Oct 02 11:43:17 crc kubenswrapper[4658]: I1002 11:43:17.415700 4658 generic.go:334] "Generic (PLEG): container finished" podID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerID="ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142" exitCode=0 Oct 02 11:43:17 crc kubenswrapper[4658]: I1002 11:43:17.416001 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerDied","Data":"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142"} Oct 02 11:43:18 crc kubenswrapper[4658]: I1002 11:43:18.428257 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerStarted","Data":"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2"} Oct 02 11:43:18 crc kubenswrapper[4658]: I1002 11:43:18.454695 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vjbwc" podStartSLOduration=3.00438083 podStartE2EDuration="5.454678203s" podCreationTimestamp="2025-10-02 11:43:13 +0000 UTC" firstStartedPulling="2025-10-02 11:43:15.394325093 +0000 UTC m=+1476.285478660" lastFinishedPulling="2025-10-02 
11:43:17.844622466 +0000 UTC m=+1478.735776033" observedRunningTime="2025-10-02 11:43:18.448034695 +0000 UTC m=+1479.339188272" watchObservedRunningTime="2025-10-02 11:43:18.454678203 +0000 UTC m=+1479.345831770" Oct 02 11:43:21 crc kubenswrapper[4658]: I1002 11:43:21.314583 4658 scope.go:117] "RemoveContainer" containerID="8779db10fbef68937802a29919f9f9dd2cbddf39104c74c0fdd50ee885a5ca23" Oct 02 11:43:23 crc kubenswrapper[4658]: I1002 11:43:23.623953 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:23 crc kubenswrapper[4658]: I1002 11:43:23.624544 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:23 crc kubenswrapper[4658]: I1002 11:43:23.689945 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:24 crc kubenswrapper[4658]: I1002 11:43:24.536734 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:24 crc kubenswrapper[4658]: I1002 11:43:24.582497 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:26 crc kubenswrapper[4658]: I1002 11:43:26.509832 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vjbwc" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="registry-server" containerID="cri-o://329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2" gracePeriod=2 Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.268565 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.429749 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.429812 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.465871 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pclt9\" (UniqueName: \"kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9\") pod \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.465977 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content\") pod \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.466238 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities\") pod \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\" (UID: \"37d9421b-31fe-423e-9b90-6fa9a3a2ae98\") " Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.466971 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities" (OuterVolumeSpecName: "utilities") pod "37d9421b-31fe-423e-9b90-6fa9a3a2ae98" (UID: "37d9421b-31fe-423e-9b90-6fa9a3a2ae98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.473315 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9" (OuterVolumeSpecName: "kube-api-access-pclt9") pod "37d9421b-31fe-423e-9b90-6fa9a3a2ae98" (UID: "37d9421b-31fe-423e-9b90-6fa9a3a2ae98"). InnerVolumeSpecName "kube-api-access-pclt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.519278 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37d9421b-31fe-423e-9b90-6fa9a3a2ae98" (UID: "37d9421b-31fe-423e-9b90-6fa9a3a2ae98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.529212 4658 generic.go:334] "Generic (PLEG): container finished" podID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerID="329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2" exitCode=0 Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.529251 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerDied","Data":"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2"} Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.529277 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjbwc" event={"ID":"37d9421b-31fe-423e-9b90-6fa9a3a2ae98","Type":"ContainerDied","Data":"8c60c7d700d662e5a84caffdc23cb4e47331c8b775929fa4ae581143207f4b25"} Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.529307 4658 scope.go:117] "RemoveContainer" containerID="329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.529450 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjbwc" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.567353 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.568720 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.568747 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pclt9\" (UniqueName: \"kubernetes.io/projected/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-kube-api-access-pclt9\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.568757 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d9421b-31fe-423e-9b90-6fa9a3a2ae98-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.570674 4658 scope.go:117] "RemoveContainer" containerID="ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.581620 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vjbwc"] Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.590838 4658 scope.go:117] "RemoveContainer" containerID="918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.658500 4658 scope.go:117] "RemoveContainer" containerID="329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2" Oct 02 11:43:27 crc kubenswrapper[4658]: E1002 11:43:27.659105 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2\": container with ID starting with 329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2 not found: ID does not exist" containerID="329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.659144 
4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2"} err="failed to get container status \"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2\": rpc error: code = NotFound desc = could not find container \"329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2\": container with ID starting with 329e8eb612d6764d136ba866a42f5924d670e528dacc29a88a67009c219d21c2 not found: ID does not exist" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.659180 4658 scope.go:117] "RemoveContainer" containerID="ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142" Oct 02 11:43:27 crc kubenswrapper[4658]: E1002 11:43:27.659626 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142\": container with ID starting with ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142 not found: ID does not exist" containerID="ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.659648 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142"} err="failed to get container status \"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142\": rpc error: code = NotFound desc = could not find container \"ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142\": container with ID starting with ba93ea3ad713431fb6beea867639e081148bd0d9a8a1d9ea85f596a0de24a142 not found: ID does not exist" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.659660 4658 scope.go:117] "RemoveContainer" containerID="918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e" Oct 02 11:43:27 crc kubenswrapper[4658]: E1002 11:43:27.659941 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e\": container with ID starting with 918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e not found: ID does not exist" containerID="918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.659964 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e"} err="failed to get container status \"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e\": rpc error: code = NotFound desc = could not find container \"918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e\": container with ID starting with 918f91b0e4f7f8d56bfb23502dae059a538aacd04bdf62e84c5bf737f494528e not found: ID does not exist" Oct 02 11:43:27 crc kubenswrapper[4658]: I1002 11:43:27.974842 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" path="/var/lib/kubelet/pods/37d9421b-31fe-423e-9b90-6fa9a3a2ae98/volumes" Oct 02 11:43:57 crc kubenswrapper[4658]: I1002 11:43:57.429870 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:43:57 crc kubenswrapper[4658]: I1002 11:43:57.430443 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.304621 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:43:59 crc kubenswrapper[4658]: E1002 11:43:59.305707 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="extract-utilities" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.305725 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="extract-utilities" Oct 02 11:43:59 crc kubenswrapper[4658]: E1002 11:43:59.305754 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="extract-content" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.305761 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="extract-content" Oct 02 11:43:59 crc kubenswrapper[4658]: E1002 11:43:59.305786 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="registry-server" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.305795 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="registry-server" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.306021 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d9421b-31fe-423e-9b90-6fa9a3a2ae98" containerName="registry-server" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.307876 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.319401 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.431780 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnffl\" (UniqueName: \"kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.432003 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.432049 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.533766 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.533916 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnffl\" (UniqueName: \"kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.534026 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.534448 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.534562 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.558744 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gnffl\" (UniqueName: \"kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl\") pod \"community-operators-8l6n7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:43:59 crc kubenswrapper[4658]: I1002 11:43:59.643227 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:00 crc kubenswrapper[4658]: I1002 11:44:00.214884 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:44:00 crc kubenswrapper[4658]: I1002 11:44:00.858890 4658 generic.go:334] "Generic (PLEG): container finished" podID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerID="1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14" exitCode=0 Oct 02 11:44:00 crc kubenswrapper[4658]: I1002 11:44:00.858950 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerDied","Data":"1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14"} Oct 02 11:44:00 crc kubenswrapper[4658]: I1002 11:44:00.858983 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerStarted","Data":"a16ac4ac5e85803e303d7dabded9b777fcec1387a84f9d36423a36e9dbb06f1a"} Oct 02 11:44:01 crc kubenswrapper[4658]: I1002 11:44:01.869555 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerStarted","Data":"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb"} Oct 02 11:44:02 crc kubenswrapper[4658]: I1002 11:44:02.883521 4658 generic.go:334] "Generic (PLEG): container finished" podID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerID="09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb" exitCode=0 Oct 02 11:44:02 crc kubenswrapper[4658]: I1002 11:44:02.883597 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerDied","Data":"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb"} Oct 02 11:44:03 crc kubenswrapper[4658]: I1002 11:44:03.897273 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerStarted","Data":"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d"} Oct 02 11:44:03 crc kubenswrapper[4658]: I1002 11:44:03.918839 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8l6n7" podStartSLOduration=2.390500024 podStartE2EDuration="4.918822876s" podCreationTimestamp="2025-10-02 11:43:59 +0000 UTC" firstStartedPulling="2025-10-02 11:44:00.861933829 +0000 UTC m=+1521.753087396" lastFinishedPulling="2025-10-02 11:44:03.390256681 +0000 UTC m=+1524.281410248" observedRunningTime="2025-10-02 11:44:03.918287248 +0000 UTC m=+1524.809440855" watchObservedRunningTime="2025-10-02 11:44:03.918822876 +0000 UTC m=+1524.809976443" Oct 02 11:44:09 crc kubenswrapper[4658]: I1002 11:44:09.644360 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:09 crc kubenswrapper[4658]: I1002 11:44:09.644942 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:09 crc kubenswrapper[4658]: I1002 11:44:09.708736 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:10 crc kubenswrapper[4658]: I1002 11:44:10.035169 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:10 crc kubenswrapper[4658]: I1002 11:44:10.085005 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:44:11 crc kubenswrapper[4658]: I1002 11:44:11.983557 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8l6n7" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="registry-server" containerID="cri-o://63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d" gracePeriod=2 Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.458936 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.620331 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnffl\" (UniqueName: \"kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl\") pod \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.620408 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content\") pod \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.620491 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities\") pod \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\" (UID: \"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7\") " Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.621755 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities" (OuterVolumeSpecName: "utilities") pod "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" (UID: "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.625806 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl" (OuterVolumeSpecName: "kube-api-access-gnffl") pod "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" (UID: "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7"). InnerVolumeSpecName "kube-api-access-gnffl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.663954 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" (UID: "90a7b472-a62b-44c6-9a0f-9aa66b18a3c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.722640 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.722690 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.722702 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnffl\" (UniqueName: \"kubernetes.io/projected/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7-kube-api-access-gnffl\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.998669 4658 generic.go:334] "Generic (PLEG): container finished" podID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerID="63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d" exitCode=0 Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.998709 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerDied","Data":"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d"} Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.998736 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l6n7" event={"ID":"90a7b472-a62b-44c6-9a0f-9aa66b18a3c7","Type":"ContainerDied","Data":"a16ac4ac5e85803e303d7dabded9b777fcec1387a84f9d36423a36e9dbb06f1a"} Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.998753 4658 scope.go:117] "RemoveContainer" containerID="63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d" Oct 02 11:44:12 crc kubenswrapper[4658]: I1002 11:44:12.998748 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l6n7" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.035195 4658 scope.go:117] "RemoveContainer" containerID="09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.040453 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.055187 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8l6n7"] Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.057968 4658 scope.go:117] "RemoveContainer" containerID="1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.106641 4658 scope.go:117] "RemoveContainer" containerID="63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d" Oct 02 11:44:13 crc kubenswrapper[4658]: E1002 11:44:13.113201 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d\": container with ID starting with 63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d not found: ID does not exist" containerID="63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.113263 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d"} err="failed to get container status \"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d\": rpc error: code = NotFound desc = could not find container \"63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d\": container with ID starting with 63410d9d366938ffa559b155c806b455f7979cb7ed079ba5f5b4d9e98ce8455d not found: ID does not exist" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.113323 4658 scope.go:117] "RemoveContainer" containerID="09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb" Oct 02 11:44:13 crc kubenswrapper[4658]: E1002 11:44:13.113757 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb\": container with ID starting with 09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb not found: ID does not exist" containerID="09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.113793 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb"} err="failed to get container status \"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb\": rpc error: code = NotFound desc = could not find container \"09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb\": container with ID starting with 09a81b29d3668374aee020e809d603845293618e8e5d141ac05afb8c74e3d4eb not found: ID does not exist" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.113817 4658 scope.go:117] "RemoveContainer" containerID="1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14" Oct 02 11:44:13 crc kubenswrapper[4658]: E1002 11:44:13.114360 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14\": container with ID starting with 1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14 not found: ID does not exist" containerID="1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.114400 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14"} err="failed to get container status \"1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14\": rpc error: code = NotFound desc = could not find container \"1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14\": container with ID starting with 1254c695feff685709505039ae9ab843fe36e7bdb2f9635606c78ab5d5188d14 not found: ID does not exist" Oct 02 11:44:13 crc kubenswrapper[4658]: I1002 11:44:13.972599 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" path="/var/lib/kubelet/pods/90a7b472-a62b-44c6-9a0f-9aa66b18a3c7/volumes" Oct 02 11:44:27 crc kubenswrapper[4658]: I1002 11:44:27.429969 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:44:27 crc kubenswrapper[4658]: I1002 11:44:27.430583 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:44:27 crc kubenswrapper[4658]: I1002 11:44:27.430639 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:44:27 crc kubenswrapper[4658]: I1002 11:44:27.431605 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:44:27 crc kubenswrapper[4658]: I1002 11:44:27.431697 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" gracePeriod=600 Oct 02 11:44:27 crc kubenswrapper[4658]: E1002 11:44:27.593336 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:44:28 crc kubenswrapper[4658]: I1002 11:44:28.154493 4658 generic.go:334] 
"Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" exitCode=0 Oct 02 11:44:28 crc kubenswrapper[4658]: I1002 11:44:28.154559 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f"} Oct 02 11:44:28 crc kubenswrapper[4658]: I1002 11:44:28.154622 4658 scope.go:117] "RemoveContainer" containerID="291f0b40b657899a41b0a5366c5b61d4ebf6b86816e301bb8cd5cf300e7b2e11" Oct 02 11:44:28 crc kubenswrapper[4658]: I1002 11:44:28.155458 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:44:28 crc kubenswrapper[4658]: E1002 11:44:28.155748 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:44:41 crc kubenswrapper[4658]: I1002 11:44:41.949516 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:44:41 crc kubenswrapper[4658]: E1002 11:44:41.950232 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:44:56 crc kubenswrapper[4658]: I1002 11:44:56.951202 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:44:56 crc kubenswrapper[4658]: E1002 11:44:56.952096 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.372543 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:44:59 crc kubenswrapper[4658]: E1002 11:44:59.374333 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="extract-utilities" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.374375 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="extract-utilities" Oct 02 11:44:59 crc kubenswrapper[4658]: E1002 11:44:59.374407 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="registry-server" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.374415 4658 
state_mem.go:107] "Deleted CPUSet assignment" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="registry-server" Oct 02 11:44:59 crc kubenswrapper[4658]: E1002 11:44:59.374478 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="extract-content" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.374488 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="extract-content" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.375194 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a7b472-a62b-44c6-9a0f-9aa66b18a3c7" containerName="registry-server" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.382146 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.426544 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.470796 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hvf\" (UniqueName: \"kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.470880 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.470958 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.572791 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hvf\" (UniqueName: \"kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.573132 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.573312 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc 
kubenswrapper[4658]: I1002 11:44:59.573749 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.573793 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.593794 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hvf\" (UniqueName: \"kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf\") pod \"redhat-marketplace-hnl5w\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:44:59 crc kubenswrapper[4658]: I1002 11:44:59.725704 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.154105 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn"] Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.156211 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.163066 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.163364 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.172263 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn"] Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.241352 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:45:00 crc kubenswrapper[4658]: W1002 11:45:00.242442 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b0ebb3_1fed_4754_82cc_3ffffc547c16.slice/crio-b9eb9fbf20d83f52d4b2c4d1ad48a7f03c926a9a38d9e7c0de03575b9836dade WatchSource:0}: Error finding container b9eb9fbf20d83f52d4b2c4d1ad48a7f03c926a9a38d9e7c0de03575b9836dade: Status 404 returned error can't find the container with id b9eb9fbf20d83f52d4b2c4d1ad48a7f03c926a9a38d9e7c0de03575b9836dade Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.302424 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.302489 4658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mx58\" (UniqueName: \"kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.302961 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.404449 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.404543 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.404586 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mx58\" (UniqueName: \"kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.405705 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.415331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.428285 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mx58\" (UniqueName: \"kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58\") pod \"collect-profiles-29323425-g2jcn\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.483548 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" 
event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerStarted","Data":"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c"} Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.483946 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerStarted","Data":"b9eb9fbf20d83f52d4b2c4d1ad48a7f03c926a9a38d9e7c0de03575b9836dade"} Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.491857 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:00 crc kubenswrapper[4658]: I1002 11:45:00.995455 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn"] Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.494894 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" event={"ID":"71b9acdf-9c53-4731-91a0-3126924ff057","Type":"ContainerStarted","Data":"947a0011ab783f6fbb3dbd611b93135d065ed3f32f324114e33ba4baa74ec847"} Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.494942 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" event={"ID":"71b9acdf-9c53-4731-91a0-3126924ff057","Type":"ContainerStarted","Data":"9f852c20a9ade36ac5cd65c7921124025d91d5fdf6c65e8bc16a43e9ed816fb8"} Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.498590 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerDied","Data":"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c"} Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.498464 4658 generic.go:334] "Generic (PLEG): container finished" podID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerID="95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c" exitCode=0 Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.500598 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:45:01 crc kubenswrapper[4658]: I1002 11:45:01.518183 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" podStartSLOduration=1.518159823 podStartE2EDuration="1.518159823s" podCreationTimestamp="2025-10-02 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:45:01.509466288 +0000 UTC m=+1582.400619855" watchObservedRunningTime="2025-10-02 11:45:01.518159823 +0000 UTC m=+1582.409313410" Oct 02 11:45:02 crc kubenswrapper[4658]: I1002 11:45:02.510799 4658 generic.go:334] "Generic (PLEG): container finished" podID="71b9acdf-9c53-4731-91a0-3126924ff057" containerID="947a0011ab783f6fbb3dbd611b93135d065ed3f32f324114e33ba4baa74ec847" exitCode=0 Oct 02 11:45:02 crc kubenswrapper[4658]: I1002 11:45:02.510951 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" event={"ID":"71b9acdf-9c53-4731-91a0-3126924ff057","Type":"ContainerDied","Data":"947a0011ab783f6fbb3dbd611b93135d065ed3f32f324114e33ba4baa74ec847"} Oct 02 11:45:03 crc kubenswrapper[4658]: 
I1002 11:45:03.521563 4658 generic.go:334] "Generic (PLEG): container finished" podID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerID="21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b" exitCode=0 Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.521682 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerDied","Data":"21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b"} Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.874351 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.983785 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mx58\" (UniqueName: \"kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58\") pod \"71b9acdf-9c53-4731-91a0-3126924ff057\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.984176 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume\") pod \"71b9acdf-9c53-4731-91a0-3126924ff057\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.984336 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume\") pod \"71b9acdf-9c53-4731-91a0-3126924ff057\" (UID: \"71b9acdf-9c53-4731-91a0-3126924ff057\") " Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.985690 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume" (OuterVolumeSpecName: "config-volume") pod "71b9acdf-9c53-4731-91a0-3126924ff057" (UID: "71b9acdf-9c53-4731-91a0-3126924ff057"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.991098 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71b9acdf-9c53-4731-91a0-3126924ff057" (UID: "71b9acdf-9c53-4731-91a0-3126924ff057"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4658]: I1002 11:45:03.991477 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58" (OuterVolumeSpecName: "kube-api-access-7mx58") pod "71b9acdf-9c53-4731-91a0-3126924ff057" (UID: "71b9acdf-9c53-4731-91a0-3126924ff057"). InnerVolumeSpecName "kube-api-access-7mx58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.087029 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mx58\" (UniqueName: \"kubernetes.io/projected/71b9acdf-9c53-4731-91a0-3126924ff057-kube-api-access-7mx58\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.087068 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71b9acdf-9c53-4731-91a0-3126924ff057-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.087082 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71b9acdf-9c53-4731-91a0-3126924ff057-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.533673 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerStarted","Data":"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e"} Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.536482 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" event={"ID":"71b9acdf-9c53-4731-91a0-3126924ff057","Type":"ContainerDied","Data":"9f852c20a9ade36ac5cd65c7921124025d91d5fdf6c65e8bc16a43e9ed816fb8"} Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.536526 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f852c20a9ade36ac5cd65c7921124025d91d5fdf6c65e8bc16a43e9ed816fb8" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.536540 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn" Oct 02 11:45:04 crc kubenswrapper[4658]: I1002 11:45:04.568519 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnl5w" podStartSLOduration=3.024489797 podStartE2EDuration="5.568499854s" podCreationTimestamp="2025-10-02 11:44:59 +0000 UTC" firstStartedPulling="2025-10-02 11:45:01.500323087 +0000 UTC m=+1582.391476654" lastFinishedPulling="2025-10-02 11:45:04.044333144 +0000 UTC m=+1584.935486711" observedRunningTime="2025-10-02 11:45:04.55951621 +0000 UTC m=+1585.450669787" watchObservedRunningTime="2025-10-02 11:45:04.568499854 +0000 UTC m=+1585.459653421" Oct 02 11:45:07 crc kubenswrapper[4658]: I1002 11:45:07.949577 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:45:07 crc kubenswrapper[4658]: E1002 11:45:07.950456 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:45:09 crc kubenswrapper[4658]: I1002 11:45:09.726582 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:09 crc kubenswrapper[4658]: I1002 11:45:09.727586 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:09 crc kubenswrapper[4658]: I1002 11:45:09.778011 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:10 crc kubenswrapper[4658]: I1002 11:45:10.633046 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:10 crc kubenswrapper[4658]: I1002 11:45:10.692383 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:45:11 crc kubenswrapper[4658]: I1002 11:45:11.601068 4658 generic.go:334] "Generic (PLEG): container finished" podID="3e768ea4-04c3-4825-9431-a37f41f34a01" containerID="8d3d6fa6fdeab877df7f3591d4fe5a45fd6361963bcfd6ad52a8e6a69f257b9c" exitCode=0 Oct 02 11:45:11 crc kubenswrapper[4658]: I1002 11:45:11.602106 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" event={"ID":"3e768ea4-04c3-4825-9431-a37f41f34a01","Type":"ContainerDied","Data":"8d3d6fa6fdeab877df7f3591d4fe5a45fd6361963bcfd6ad52a8e6a69f257b9c"} Oct 02 11:45:12 crc kubenswrapper[4658]: I1002 11:45:12.614131 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnl5w" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="registry-server" containerID="cri-o://33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e" gracePeriod=2 Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.095258 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.105682 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164068 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hvf\" (UniqueName: \"kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf\") pod \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164156 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content\") pod \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164189 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brfh7\" (UniqueName: \"kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7\") pod \"3e768ea4-04c3-4825-9431-a37f41f34a01\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164231 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle\") pod \"3e768ea4-04c3-4825-9431-a37f41f34a01\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164270 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory\") pod \"3e768ea4-04c3-4825-9431-a37f41f34a01\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164382 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key\") pod \"3e768ea4-04c3-4825-9431-a37f41f34a01\" (UID: \"3e768ea4-04c3-4825-9431-a37f41f34a01\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.164487 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities\") pod \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\" (UID: \"d8b0ebb3-1fed-4754-82cc-3ffffc547c16\") " Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.165690 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities" (OuterVolumeSpecName: "utilities") pod "d8b0ebb3-1fed-4754-82cc-3ffffc547c16" (UID: "d8b0ebb3-1fed-4754-82cc-3ffffc547c16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.171112 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3e768ea4-04c3-4825-9431-a37f41f34a01" (UID: "3e768ea4-04c3-4825-9431-a37f41f34a01"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.171401 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7" (OuterVolumeSpecName: "kube-api-access-brfh7") pod "3e768ea4-04c3-4825-9431-a37f41f34a01" (UID: "3e768ea4-04c3-4825-9431-a37f41f34a01"). InnerVolumeSpecName "kube-api-access-brfh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.171726 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf" (OuterVolumeSpecName: "kube-api-access-l6hvf") pod "d8b0ebb3-1fed-4754-82cc-3ffffc547c16" (UID: "d8b0ebb3-1fed-4754-82cc-3ffffc547c16"). InnerVolumeSpecName "kube-api-access-l6hvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.181767 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8b0ebb3-1fed-4754-82cc-3ffffc547c16" (UID: "d8b0ebb3-1fed-4754-82cc-3ffffc547c16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.194058 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e768ea4-04c3-4825-9431-a37f41f34a01" (UID: "3e768ea4-04c3-4825-9431-a37f41f34a01"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.195032 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory" (OuterVolumeSpecName: "inventory") pod "3e768ea4-04c3-4825-9431-a37f41f34a01" (UID: "3e768ea4-04c3-4825-9431-a37f41f34a01"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266414 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266451 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hvf\" (UniqueName: \"kubernetes.io/projected/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-kube-api-access-l6hvf\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266465 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b0ebb3-1fed-4754-82cc-3ffffc547c16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266477 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brfh7\" (UniqueName: \"kubernetes.io/projected/3e768ea4-04c3-4825-9431-a37f41f34a01-kube-api-access-brfh7\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266489 4658 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266500 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.266509 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e768ea4-04c3-4825-9431-a37f41f34a01-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.627957 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.627954 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx" event={"ID":"3e768ea4-04c3-4825-9431-a37f41f34a01","Type":"ContainerDied","Data":"437b8fccd561715cdc2047437c3cb33a509139966c0ee8ea7d62b67f3f6ececb"} Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.628415 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437b8fccd561715cdc2047437c3cb33a509139966c0ee8ea7d62b67f3f6ececb" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.630907 4658 generic.go:334] "Generic (PLEG): container finished" podID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerID="33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e" exitCode=0 Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.630955 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnl5w" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.630962 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerDied","Data":"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e"} Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.630987 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnl5w" event={"ID":"d8b0ebb3-1fed-4754-82cc-3ffffc547c16","Type":"ContainerDied","Data":"b9eb9fbf20d83f52d4b2c4d1ad48a7f03c926a9a38d9e7c0de03575b9836dade"} Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.631007 4658 scope.go:117] "RemoveContainer" containerID="33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.686281 4658 scope.go:117] "RemoveContainer" containerID="21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.708196 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.720373 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnl5w"] Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.734364 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f"] Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.735085 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="registry-server" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.735187 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="registry-server" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.735283 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="extract-utilities" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.735487 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="extract-utilities" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.735557 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b9acdf-9c53-4731-91a0-3126924ff057" containerName="collect-profiles" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.735626 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b9acdf-9c53-4731-91a0-3126924ff057" containerName="collect-profiles" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.735718 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="extract-content" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.735783 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="extract-content" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.735863 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e768ea4-04c3-4825-9431-a37f41f34a01" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.735927 4658 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3e768ea4-04c3-4825-9431-a37f41f34a01" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.736230 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" containerName="registry-server" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.736340 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b9acdf-9c53-4731-91a0-3126924ff057" containerName="collect-profiles" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.736440 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e768ea4-04c3-4825-9431-a37f41f34a01" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.737408 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.741247 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.741508 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.741377 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.741418 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.749485 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f"] Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.767511 4658 scope.go:117] "RemoveContainer" containerID="95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.786351 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.786490 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.786575 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vjj\" (UniqueName: \"kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.810245 4658 scope.go:117] "RemoveContainer" 
containerID="33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.811554 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e\": container with ID starting with 33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e not found: ID does not exist" containerID="33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.811610 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e"} err="failed to get container status \"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e\": rpc error: code = NotFound desc = could not find container \"33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e\": container with ID starting with 33a688ab52cdd38700c89797c0255b381d5ac573ac934a0e0c537c436377c19e not found: ID does not exist" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.811645 4658 scope.go:117] "RemoveContainer" containerID="21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.812186 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b\": container with ID starting with 21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b not found: ID does not exist" containerID="21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.812237 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b"} err="failed to get container status \"21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b\": rpc error: code = NotFound desc = could not find container \"21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b\": container with ID starting with 21656ea6ad41c6954ace96a54f91dc6f521e72d3e92b1f63cd4c67fce2cd2f7b not found: ID does not exist" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.812269 4658 scope.go:117] "RemoveContainer" containerID="95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c" Oct 02 11:45:13 crc kubenswrapper[4658]: E1002 11:45:13.812605 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c\": container with ID starting with 95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c not found: ID does not exist" containerID="95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.812636 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c"} err="failed to get container status \"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c\": rpc error: code = NotFound desc = could not find container \"95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c\": container with ID starting with 
95b5cd4012753b7fed3419c96aa8c71a30ea13e9618f04035ddd589b257c371c not found: ID does not exist" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.888485 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.888609 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.888696 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vjj\" (UniqueName: \"kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.893375 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.893475 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.908640 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vjj\" (UniqueName: \"kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6b65f\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:13 crc kubenswrapper[4658]: I1002 11:45:13.966647 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b0ebb3-1fed-4754-82cc-3ffffc547c16" path="/var/lib/kubelet/pods/d8b0ebb3-1fed-4754-82cc-3ffffc547c16/volumes" Oct 02 11:45:14 crc kubenswrapper[4658]: I1002 11:45:14.079765 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:45:14 crc kubenswrapper[4658]: I1002 11:45:14.595029 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f"] Oct 02 11:45:14 crc kubenswrapper[4658]: I1002 11:45:14.638790 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" event={"ID":"43792b79-e840-4c83-b2b9-8068765b000a","Type":"ContainerStarted","Data":"8205ec1026e692684a63517da12a6cbd6362baabbb5410da13b9a68886659a5c"} Oct 02 11:45:15 crc kubenswrapper[4658]: I1002 11:45:15.651139 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" event={"ID":"43792b79-e840-4c83-b2b9-8068765b000a","Type":"ContainerStarted","Data":"01ce708cc316db25b251c71d2dab332c1311adf7cc265abfc02b173d8ea6f58d"} Oct 02 11:45:15 crc kubenswrapper[4658]: I1002 11:45:15.673434 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" podStartSLOduration=2.222635078 podStartE2EDuration="2.673414732s" podCreationTimestamp="2025-10-02 11:45:13 +0000 UTC" firstStartedPulling="2025-10-02 11:45:14.598782628 +0000 UTC m=+1595.489936195" lastFinishedPulling="2025-10-02 11:45:15.049562282 +0000 UTC m=+1595.940715849" observedRunningTime="2025-10-02 11:45:15.666813685 +0000 UTC m=+1596.557967262" watchObservedRunningTime="2025-10-02 11:45:15.673414732 +0000 UTC m=+1596.564568309" Oct 02 11:45:18 crc kubenswrapper[4658]: I1002 11:45:18.949273 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:45:18 crc kubenswrapper[4658]: E1002 11:45:18.950139 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:45:29 crc kubenswrapper[4658]: I1002 11:45:29.961016 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:45:29 crc kubenswrapper[4658]: E1002 11:45:29.961717 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:45:40 crc kubenswrapper[4658]: I1002 11:45:40.950267 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:45:40 crc kubenswrapper[4658]: E1002 11:45:40.953934 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.049368 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hqq9r"] Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.058882 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dggdx"] Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.069959 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hqq9r"] Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.079944 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dggdx"] Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.961153 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36cebe9d-b6af-4e46-83ad-ddafab15aefb" path="/var/lib/kubelet/pods/36cebe9d-b6af-4e46-83ad-ddafab15aefb/volumes" Oct 02 11:45:41 crc kubenswrapper[4658]: I1002 11:45:41.962176 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1907c4-bad7-43fe-8982-c4ea70df1a12" path="/var/lib/kubelet/pods/8d1907c4-bad7-43fe-8982-c4ea70df1a12/volumes" Oct 02 11:45:42 crc kubenswrapper[4658]: I1002 11:45:42.033765 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-h8l2l"] Oct 02 11:45:42 crc kubenswrapper[4658]: I1002 11:45:42.044072 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-h8l2l"] Oct 02 11:45:43 crc kubenswrapper[4658]: I1002 11:45:43.037384 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-mccmm"] Oct 02 11:45:43 crc kubenswrapper[4658]: I1002 11:45:43.045924 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-mccmm"] Oct 02 11:45:43 crc kubenswrapper[4658]: I1002 11:45:43.967666 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a91ffa-90bd-4ca9-9441-fccbed461ced" path="/var/lib/kubelet/pods/25a91ffa-90bd-4ca9-9441-fccbed461ced/volumes" Oct 02 11:45:43 crc kubenswrapper[4658]: I1002 11:45:43.968512 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65aa35f-e3ca-4da7-af49-56f8c1af3e0e" path="/var/lib/kubelet/pods/c65aa35f-e3ca-4da7-af49-56f8c1af3e0e/volumes" Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.035187 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bbc1-account-create-47jjw"] Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.044952 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6ffc-account-create-2fxc9"] Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.053640 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6ffc-account-create-2fxc9"] Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.061328 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bbc1-account-create-47jjw"] Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.948852 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:45:51 crc kubenswrapper[4658]: E1002 11:45:51.949174 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.961658 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e00451-8e3d-4e55-935c-7df7a71c261e" path="/var/lib/kubelet/pods/41e00451-8e3d-4e55-935c-7df7a71c261e/volumes" Oct 02 11:45:51 crc kubenswrapper[4658]: I1002 11:45:51.962606 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d84d533-3387-4b94-b519-c354db47dea0" path="/var/lib/kubelet/pods/9d84d533-3387-4b94-b519-c354db47dea0/volumes" Oct 02 11:45:52 crc kubenswrapper[4658]: I1002 11:45:52.026957 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-95c4-account-create-wfwww"] Oct 02 11:45:52 crc kubenswrapper[4658]: I1002 11:45:52.036903 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-95c4-account-create-wfwww"] Oct 02 11:45:53 crc kubenswrapper[4658]: I1002 11:45:53.028285 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-5eba-account-create-4bnj5"] Oct 02 11:45:53 crc kubenswrapper[4658]: I1002 11:45:53.037891 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-5eba-account-create-4bnj5"] Oct 02 11:45:53 crc kubenswrapper[4658]: I1002 11:45:53.961997 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56dc1de8-4f80-4a7b-b461-6b5e830a889b" path="/var/lib/kubelet/pods/56dc1de8-4f80-4a7b-b461-6b5e830a889b/volumes" Oct 02 11:45:53 crc kubenswrapper[4658]: I1002 11:45:53.963651 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45d37dd-6bcd-4d3d-ab46-dabd8242a213" path="/var/lib/kubelet/pods/e45d37dd-6bcd-4d3d-ab46-dabd8242a213/volumes" Oct 02 11:45:59 crc kubenswrapper[4658]: I1002 11:45:59.029257 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hdzkf"] Oct 02 11:45:59 crc kubenswrapper[4658]: I1002 11:45:59.056006 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hdzkf"] Oct 02 11:45:59 crc kubenswrapper[4658]: I1002 11:45:59.961202 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ba0196-c5f9-40ea-b43b-24c2d9e9ad60" path="/var/lib/kubelet/pods/65ba0196-c5f9-40ea-b43b-24c2d9e9ad60/volumes" Oct 02 11:46:02 crc kubenswrapper[4658]: I1002 11:46:02.949533 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:46:02 crc kubenswrapper[4658]: E1002 11:46:02.950552 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.040256 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-g7rkz"] Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.054349 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pd2cm"] Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.067251 4658 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-db-create-g7rkz"] Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.078570 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pd2cm"] Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.961125 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721f38ad-db77-4b36-aa92-0c5ea5821709" path="/var/lib/kubelet/pods/721f38ad-db77-4b36-aa92-0c5ea5821709/volumes" Oct 02 11:46:03 crc kubenswrapper[4658]: I1002 11:46:03.963772 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba10cce-80f9-474b-aece-681f238af730" path="/var/lib/kubelet/pods/7ba10cce-80f9-474b-aece-681f238af730/volumes" Oct 02 11:46:13 crc kubenswrapper[4658]: I1002 11:46:13.948750 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:46:13 crc kubenswrapper[4658]: E1002 11:46:13.949522 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.472054 4658 scope.go:117] "RemoveContainer" containerID="c8a33a3288e8c0b84b45ccd48df105b38969ec319dba5613e33d13edeb538045" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.497380 4658 scope.go:117] "RemoveContainer" containerID="afeb8dde19966c2f08a27ba588a4a3c8076e6262b0da3a788fee0aa1934f11f0" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.539119 4658 scope.go:117] "RemoveContainer" containerID="8d6b19e7cf64f6333f6b7f64511c0ce3de509dd36acc8f9d4967e05a1ea47c59" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.584122 4658 scope.go:117] "RemoveContainer" containerID="b2aa095548c3094b11dbc20535bc980660ad848e8d2106b9b0157da4febf98e2" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.648626 4658 scope.go:117] "RemoveContainer" containerID="1f72312c38b0d2a1dfdcf6b43921ffb38074ef2c54fcdbdf219a4af1f9eb1dbf" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.731909 4658 scope.go:117] "RemoveContainer" containerID="76ea77dc33c44501a56b24922634e9c587ba3159351f3eb5f52f83f91a9090f8" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.775596 4658 scope.go:117] "RemoveContainer" containerID="e66c67cff56fbb8920a3618d65a9a75b4462f084982b05dc57a21f19ec9ed343" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.803721 4658 scope.go:117] "RemoveContainer" containerID="227fe2f696a1aecbd575f7868fc7a7d728895e8dac4c85d318a49efe412cb590" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.824393 4658 scope.go:117] "RemoveContainer" containerID="f6f34af0f1cc13ac3cd33a10c18ee05641736727140c259693baba2af2659b52" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.843863 4658 scope.go:117] "RemoveContainer" containerID="c2e4fe080dad8831fce639647d70a7f9b9ece51b748726dcc1239e48a71df7c4" Oct 02 11:46:21 crc kubenswrapper[4658]: I1002 11:46:21.867911 4658 scope.go:117] "RemoveContainer" containerID="25789d8280b0feee6a663dba100a82e9f3725e5e1f99eca23a03ea3861044b67" Oct 02 11:46:22 crc kubenswrapper[4658]: I1002 11:46:22.039389 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d0c3-account-create-kbt5q"] Oct 02 11:46:22 crc 
kubenswrapper[4658]: I1002 11:46:22.053286 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-be39-account-create-j7lbd"] Oct 02 11:46:22 crc kubenswrapper[4658]: I1002 11:46:22.062474 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b069-account-create-mrfzg"] Oct 02 11:46:22 crc kubenswrapper[4658]: I1002 11:46:22.071108 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d0c3-account-create-kbt5q"] Oct 02 11:46:22 crc kubenswrapper[4658]: I1002 11:46:22.080349 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-be39-account-create-j7lbd"] Oct 02 11:46:22 crc kubenswrapper[4658]: I1002 11:46:22.087795 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b069-account-create-mrfzg"] Oct 02 11:46:23 crc kubenswrapper[4658]: I1002 11:46:23.961218 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a11803-3f64-4028-b71d-bab0c3e89ec3" path="/var/lib/kubelet/pods/08a11803-3f64-4028-b71d-bab0c3e89ec3/volumes" Oct 02 11:46:23 crc kubenswrapper[4658]: I1002 11:46:23.962389 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f10f35-b78b-4238-8d60-917300aaa9ad" path="/var/lib/kubelet/pods/27f10f35-b78b-4238-8d60-917300aaa9ad/volumes" Oct 02 11:46:23 crc kubenswrapper[4658]: I1002 11:46:23.963071 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa" path="/var/lib/kubelet/pods/3e7c4276-4d14-4bf4-b5cf-b75f3f34cfaa/volumes" Oct 02 11:46:25 crc kubenswrapper[4658]: I1002 11:46:25.950940 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:46:25 crc kubenswrapper[4658]: E1002 11:46:25.951525 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:46:29 crc kubenswrapper[4658]: I1002 11:46:29.027701 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cgghm"] Oct 02 11:46:29 crc kubenswrapper[4658]: I1002 11:46:29.035867 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cgghm"] Oct 02 11:46:29 crc kubenswrapper[4658]: I1002 11:46:29.961381 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc08b85-172e-4a85-8c1a-dc6c713737fd" path="/var/lib/kubelet/pods/1bc08b85-172e-4a85-8c1a-dc6c713737fd/volumes" Oct 02 11:46:38 crc kubenswrapper[4658]: I1002 11:46:38.949032 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:46:38 crc kubenswrapper[4658]: E1002 11:46:38.950048 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:46:47 crc kubenswrapper[4658]: I1002 11:46:47.047640 4658 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-ml5sj"] Oct 02 11:46:47 crc kubenswrapper[4658]: I1002 11:46:47.057264 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-ml5sj"] Oct 02 11:46:47 crc kubenswrapper[4658]: I1002 11:46:47.982957 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa1ebca-0cdd-4bce-adf2-e8273c3448f1" path="/var/lib/kubelet/pods/efa1ebca-0cdd-4bce-adf2-e8273c3448f1/volumes" Oct 02 11:46:50 crc kubenswrapper[4658]: I1002 11:46:50.949282 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:46:50 crc kubenswrapper[4658]: E1002 11:46:50.950881 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:46:55 crc kubenswrapper[4658]: I1002 11:46:55.612335 4658 generic.go:334] "Generic (PLEG): container finished" podID="43792b79-e840-4c83-b2b9-8068765b000a" containerID="01ce708cc316db25b251c71d2dab332c1311adf7cc265abfc02b173d8ea6f58d" exitCode=0 Oct 02 11:46:55 crc kubenswrapper[4658]: I1002 11:46:55.612437 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" event={"ID":"43792b79-e840-4c83-b2b9-8068765b000a","Type":"ContainerDied","Data":"01ce708cc316db25b251c71d2dab332c1311adf7cc265abfc02b173d8ea6f58d"} Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.026337 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.123791 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46vjj\" (UniqueName: \"kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj\") pod \"43792b79-e840-4c83-b2b9-8068765b000a\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.123948 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key\") pod \"43792b79-e840-4c83-b2b9-8068765b000a\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.124004 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory\") pod \"43792b79-e840-4c83-b2b9-8068765b000a\" (UID: \"43792b79-e840-4c83-b2b9-8068765b000a\") " Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.128875 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj" (OuterVolumeSpecName: "kube-api-access-46vjj") pod "43792b79-e840-4c83-b2b9-8068765b000a" (UID: "43792b79-e840-4c83-b2b9-8068765b000a"). InnerVolumeSpecName "kube-api-access-46vjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.151618 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory" (OuterVolumeSpecName: "inventory") pod "43792b79-e840-4c83-b2b9-8068765b000a" (UID: "43792b79-e840-4c83-b2b9-8068765b000a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.163444 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43792b79-e840-4c83-b2b9-8068765b000a" (UID: "43792b79-e840-4c83-b2b9-8068765b000a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.226809 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.226841 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43792b79-e840-4c83-b2b9-8068765b000a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.226853 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46vjj\" (UniqueName: \"kubernetes.io/projected/43792b79-e840-4c83-b2b9-8068765b000a-kube-api-access-46vjj\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.657984 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" event={"ID":"43792b79-e840-4c83-b2b9-8068765b000a","Type":"ContainerDied","Data":"8205ec1026e692684a63517da12a6cbd6362baabbb5410da13b9a68886659a5c"} Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.658034 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8205ec1026e692684a63517da12a6cbd6362baabbb5410da13b9a68886659a5c" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.658053 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6b65f" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.715707 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn"] Oct 02 11:46:57 crc kubenswrapper[4658]: E1002 11:46:57.716163 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43792b79-e840-4c83-b2b9-8068765b000a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.716187 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="43792b79-e840-4c83-b2b9-8068765b000a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.716469 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="43792b79-e840-4c83-b2b9-8068765b000a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.717343 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.720345 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.720691 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.720846 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.721155 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.727072 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn"] Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.736202 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.736324 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.736366 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqcb4\" (UniqueName: \"kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.838436 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.838579 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.838635 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqcb4\" (UniqueName: \"kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.844085 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.846322 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:57 crc kubenswrapper[4658]: I1002 11:46:57.863658 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqcb4\" (UniqueName: \"kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:58 crc kubenswrapper[4658]: I1002 11:46:58.033255 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:46:58 crc kubenswrapper[4658]: I1002 11:46:58.584036 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn"] Oct 02 11:46:58 crc kubenswrapper[4658]: W1002 11:46:58.599147 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eed4da6_fdf5_4db6_9e72_1d3052a54482.slice/crio-1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861 WatchSource:0}: Error finding container 1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861: Status 404 returned error can't find the container with id 1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861 Oct 02 11:46:58 crc kubenswrapper[4658]: I1002 11:46:58.671228 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" event={"ID":"6eed4da6-fdf5-4db6-9e72-1d3052a54482","Type":"ContainerStarted","Data":"1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861"} Oct 02 11:46:59 crc kubenswrapper[4658]: I1002 11:46:59.688045 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" event={"ID":"6eed4da6-fdf5-4db6-9e72-1d3052a54482","Type":"ContainerStarted","Data":"eaae52704d260c4e23b70a782df4c98ac7a776c883d86aa4ed9c7da2960f5095"} Oct 02 11:46:59 crc kubenswrapper[4658]: I1002 11:46:59.713446 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" podStartSLOduration=2.035965753 podStartE2EDuration="2.713423577s" podCreationTimestamp="2025-10-02 11:46:57 +0000 UTC" firstStartedPulling="2025-10-02 11:46:58.603333399 +0000 UTC 
m=+1699.494486966" lastFinishedPulling="2025-10-02 11:46:59.280791203 +0000 UTC m=+1700.171944790" observedRunningTime="2025-10-02 11:46:59.702903168 +0000 UTC m=+1700.594056735" watchObservedRunningTime="2025-10-02 11:46:59.713423577 +0000 UTC m=+1700.604577144" Oct 02 11:47:05 crc kubenswrapper[4658]: I1002 11:47:05.949638 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:47:05 crc kubenswrapper[4658]: E1002 11:47:05.950417 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:47:06 crc kubenswrapper[4658]: I1002 11:47:06.038187 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kbvvc"] Oct 02 11:47:06 crc kubenswrapper[4658]: I1002 11:47:06.070081 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kbvvc"] Oct 02 11:47:07 crc kubenswrapper[4658]: I1002 11:47:07.959936 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a291e0-0291-4591-8d80-818338d6ae2d" path="/var/lib/kubelet/pods/87a291e0-0291-4591-8d80-818338d6ae2d/volumes" Oct 02 11:47:08 crc kubenswrapper[4658]: I1002 11:47:08.032833 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jftqc"] Oct 02 11:47:08 crc kubenswrapper[4658]: I1002 11:47:08.043217 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jftqc"] Oct 02 11:47:09 crc kubenswrapper[4658]: I1002 11:47:09.959480 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab" path="/var/lib/kubelet/pods/8e1d8712-d0e8-4ad3-83c2-c6b82a92bdab/volumes" Oct 02 11:47:15 crc kubenswrapper[4658]: I1002 11:47:15.029079 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9hqkv"] Oct 02 11:47:15 crc kubenswrapper[4658]: I1002 11:47:15.039727 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9hqkv"] Oct 02 11:47:15 crc kubenswrapper[4658]: I1002 11:47:15.968671 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4602160-442e-4a87-bacb-3493da6f4dad" path="/var/lib/kubelet/pods/a4602160-442e-4a87-bacb-3493da6f4dad/volumes" Oct 02 11:47:16 crc kubenswrapper[4658]: I1002 11:47:16.948745 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:47:16 crc kubenswrapper[4658]: E1002 11:47:16.948996 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:47:18 crc kubenswrapper[4658]: I1002 11:47:18.034156 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dc6wn"] Oct 02 11:47:18 crc kubenswrapper[4658]: I1002 11:47:18.041686 4658 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dc6wn"] Oct 02 11:47:19 crc kubenswrapper[4658]: I1002 11:47:19.979471 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916133b3-3541-40ec-b32a-4b8bf4870d7f" path="/var/lib/kubelet/pods/916133b3-3541-40ec-b32a-4b8bf4870d7f/volumes" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.097951 4658 scope.go:117] "RemoveContainer" containerID="5d729cb5a7c5d82ac7670551e6f885beb68cc768a5939d3bd7e8d544a555ecaa" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.129459 4658 scope.go:117] "RemoveContainer" containerID="6a24cfcc9773fd7d06ec82cd910121ea08f1cec024a89c168796211b0510da67" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.188933 4658 scope.go:117] "RemoveContainer" containerID="b815b56b43296362dc4f3470f3d7e8ef1d65ff3d6f6ed7a1580287738ab2e409" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.235131 4658 scope.go:117] "RemoveContainer" containerID="806bda466dc507d954b7c5a64bd6585e9b0914e91020581cda090db9bef02e16" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.298707 4658 scope.go:117] "RemoveContainer" containerID="bb90bf4685877447040efedd5233a6b5c92c4c0acda94e1a5dd0473a65be42f7" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.337953 4658 scope.go:117] "RemoveContainer" containerID="2903d0a3f21ea87cd2d81cd948f6918ca802a6305e105e94d14a48da913c5027" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.385070 4658 scope.go:117] "RemoveContainer" containerID="654464221e8c58a97a0ab93c59e1723d53118315d2111f02fa611eb5b393d6e2" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.415664 4658 scope.go:117] "RemoveContainer" containerID="b239ce3207f783423497337b429199e1b0dbb924305ae106452a2eb4f3cec4b9" Oct 02 11:47:22 crc kubenswrapper[4658]: I1002 11:47:22.435496 4658 scope.go:117] "RemoveContainer" containerID="29374d3027afadc1219abc9cf8b80cf6e2cc79d695109e13531f4660ad1f4722" Oct 02 11:47:28 crc kubenswrapper[4658]: I1002 11:47:28.948988 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:47:28 crc kubenswrapper[4658]: E1002 11:47:28.949728 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:47:29 crc kubenswrapper[4658]: I1002 11:47:29.028572 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-d5ppn"] Oct 02 11:47:29 crc kubenswrapper[4658]: I1002 11:47:29.037247 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-d5ppn"] Oct 02 11:47:29 crc kubenswrapper[4658]: I1002 11:47:29.960091 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057d8045-79f8-4f4d-9b29-ce1f517e0f94" path="/var/lib/kubelet/pods/057d8045-79f8-4f4d-9b29-ce1f517e0f94/volumes" Oct 02 11:47:31 crc kubenswrapper[4658]: I1002 11:47:31.030594 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s6w77"] Oct 02 11:47:31 crc kubenswrapper[4658]: I1002 11:47:31.039562 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s6w77"] Oct 02 11:47:31 crc kubenswrapper[4658]: I1002 11:47:31.962644 4658 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6378c687-5c50-4efd-8cc5-b7aa4ef82297" path="/var/lib/kubelet/pods/6378c687-5c50-4efd-8cc5-b7aa4ef82297/volumes" Oct 02 11:47:42 crc kubenswrapper[4658]: I1002 11:47:42.949383 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:47:42 crc kubenswrapper[4658]: E1002 11:47:42.950537 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:47:55 crc kubenswrapper[4658]: I1002 11:47:55.952386 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:47:55 crc kubenswrapper[4658]: E1002 11:47:55.953173 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:48:10 crc kubenswrapper[4658]: I1002 11:48:10.402730 4658 generic.go:334] "Generic (PLEG): container finished" podID="6eed4da6-fdf5-4db6-9e72-1d3052a54482" containerID="eaae52704d260c4e23b70a782df4c98ac7a776c883d86aa4ed9c7da2960f5095" exitCode=0 Oct 02 11:48:10 crc kubenswrapper[4658]: I1002 11:48:10.403421 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" event={"ID":"6eed4da6-fdf5-4db6-9e72-1d3052a54482","Type":"ContainerDied","Data":"eaae52704d260c4e23b70a782df4c98ac7a776c883d86aa4ed9c7da2960f5095"} Oct 02 11:48:10 crc kubenswrapper[4658]: I1002 11:48:10.949621 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:48:10 crc kubenswrapper[4658]: E1002 11:48:10.949904 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:48:11 crc kubenswrapper[4658]: I1002 11:48:11.880342 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.020639 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key\") pod \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.020740 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqcb4\" (UniqueName: \"kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4\") pod \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.020856 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory\") pod \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\" (UID: \"6eed4da6-fdf5-4db6-9e72-1d3052a54482\") " Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.029707 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4" (OuterVolumeSpecName: "kube-api-access-pqcb4") pod "6eed4da6-fdf5-4db6-9e72-1d3052a54482" (UID: "6eed4da6-fdf5-4db6-9e72-1d3052a54482"). InnerVolumeSpecName "kube-api-access-pqcb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.043957 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-85tpz"] Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.053014 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory" (OuterVolumeSpecName: "inventory") pod "6eed4da6-fdf5-4db6-9e72-1d3052a54482" (UID: "6eed4da6-fdf5-4db6-9e72-1d3052a54482"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.056714 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6eed4da6-fdf5-4db6-9e72-1d3052a54482" (UID: "6eed4da6-fdf5-4db6-9e72-1d3052a54482"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.057766 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-85tpz"] Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.124122 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqcb4\" (UniqueName: \"kubernetes.io/projected/6eed4da6-fdf5-4db6-9e72-1d3052a54482-kube-api-access-pqcb4\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.124159 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.124173 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eed4da6-fdf5-4db6-9e72-1d3052a54482-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.425103 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" event={"ID":"6eed4da6-fdf5-4db6-9e72-1d3052a54482","Type":"ContainerDied","Data":"1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861"} Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.425654 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1accf182787f8ce0d34e8e091373ebce028a836cb4d5c6d759fa5c56eb066861" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.425184 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.522485 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5"] Oct 02 11:48:12 crc kubenswrapper[4658]: E1002 11:48:12.523039 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eed4da6-fdf5-4db6-9e72-1d3052a54482" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.523067 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eed4da6-fdf5-4db6-9e72-1d3052a54482" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.523419 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eed4da6-fdf5-4db6-9e72-1d3052a54482" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.527141 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.530667 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.531019 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.531402 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.531327 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.538014 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5"] Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.634338 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zp2x\" (UniqueName: \"kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.635139 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.636477 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.738765 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.738882 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.738921 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zp2x\" (UniqueName: \"kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.743208 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.744776 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.755440 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zp2x\" (UniqueName: \"kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:12 crc kubenswrapper[4658]: I1002 11:48:12.868632 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.041677 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-gt5m2"] Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.050819 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gt5m2"] Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.062048 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mrrgm"] Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.071574 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mrrgm"] Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.398829 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5"] Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.436054 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" event={"ID":"bfda0e17-a4e9-4a4f-9678-418901ed432a","Type":"ContainerStarted","Data":"f48c57e40d616bd4d3d9e30dcae12b0c46ad66a2527036461fc9cae50b28a350"} Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.959662 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08cd2749-20f1-4836-ac61-62b7d555a3b3" path="/var/lib/kubelet/pods/08cd2749-20f1-4836-ac61-62b7d555a3b3/volumes" Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.960167 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5806c84d-2c8f-402d-9487-656bd2936933" path="/var/lib/kubelet/pods/5806c84d-2c8f-402d-9487-656bd2936933/volumes" Oct 02 11:48:13 crc kubenswrapper[4658]: I1002 11:48:13.960915 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6473d21f-8a15-443f-b5ac-2211e1cf0e55" path="/var/lib/kubelet/pods/6473d21f-8a15-443f-b5ac-2211e1cf0e55/volumes" Oct 02 11:48:14 crc kubenswrapper[4658]: I1002 11:48:14.450702 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" event={"ID":"bfda0e17-a4e9-4a4f-9678-418901ed432a","Type":"ContainerStarted","Data":"a216354160cf2c95800c63c57bc0ec86ff29d5b3e38534cd76465ffdc7ae1ed6"} Oct 02 11:48:14 crc kubenswrapper[4658]: I1002 11:48:14.476586 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" podStartSLOduration=1.8923048470000001 podStartE2EDuration="2.476568329s" podCreationTimestamp="2025-10-02 11:48:12 +0000 UTC" firstStartedPulling="2025-10-02 11:48:13.405677603 +0000 UTC m=+1774.296831170" lastFinishedPulling="2025-10-02 11:48:13.989941085 +0000 UTC m=+1774.881094652" observedRunningTime="2025-10-02 11:48:14.468058735 +0000 UTC m=+1775.359212302" watchObservedRunningTime="2025-10-02 11:48:14.476568329 +0000 UTC m=+1775.367721896" Oct 02 11:48:19 crc kubenswrapper[4658]: I1002 11:48:19.502915 4658 generic.go:334] "Generic (PLEG): container finished" podID="bfda0e17-a4e9-4a4f-9678-418901ed432a" containerID="a216354160cf2c95800c63c57bc0ec86ff29d5b3e38534cd76465ffdc7ae1ed6" exitCode=0 Oct 02 11:48:19 crc kubenswrapper[4658]: I1002 11:48:19.503590 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" event={"ID":"bfda0e17-a4e9-4a4f-9678-418901ed432a","Type":"ContainerDied","Data":"a216354160cf2c95800c63c57bc0ec86ff29d5b3e38534cd76465ffdc7ae1ed6"} Oct 02 11:48:20 crc kubenswrapper[4658]: I1002 11:48:20.890761 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.011729 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key\") pod \"bfda0e17-a4e9-4a4f-9678-418901ed432a\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.012500 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zp2x\" (UniqueName: \"kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x\") pod \"bfda0e17-a4e9-4a4f-9678-418901ed432a\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.012682 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory\") pod \"bfda0e17-a4e9-4a4f-9678-418901ed432a\" (UID: \"bfda0e17-a4e9-4a4f-9678-418901ed432a\") " Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.021495 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x" (OuterVolumeSpecName: "kube-api-access-2zp2x") pod "bfda0e17-a4e9-4a4f-9678-418901ed432a" (UID: "bfda0e17-a4e9-4a4f-9678-418901ed432a"). InnerVolumeSpecName "kube-api-access-2zp2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.051536 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e5d5-account-create-pwtn4"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.060451 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory" (OuterVolumeSpecName: "inventory") pod "bfda0e17-a4e9-4a4f-9678-418901ed432a" (UID: "bfda0e17-a4e9-4a4f-9678-418901ed432a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.062506 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bfda0e17-a4e9-4a4f-9678-418901ed432a" (UID: "bfda0e17-a4e9-4a4f-9678-418901ed432a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.063116 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1e6d-account-create-mg5kn"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.073118 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-20ab-account-create-swrjh"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.081069 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e5d5-account-create-pwtn4"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.089447 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-20ab-account-create-swrjh"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.096653 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1e6d-account-create-mg5kn"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.115334 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.115374 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfda0e17-a4e9-4a4f-9678-418901ed432a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.115384 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zp2x\" (UniqueName: \"kubernetes.io/projected/bfda0e17-a4e9-4a4f-9678-418901ed432a-kube-api-access-2zp2x\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.526252 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" event={"ID":"bfda0e17-a4e9-4a4f-9678-418901ed432a","Type":"ContainerDied","Data":"f48c57e40d616bd4d3d9e30dcae12b0c46ad66a2527036461fc9cae50b28a350"} Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.526312 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f48c57e40d616bd4d3d9e30dcae12b0c46ad66a2527036461fc9cae50b28a350" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.526314 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.591884 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7"] Oct 02 11:48:21 crc kubenswrapper[4658]: E1002 11:48:21.592358 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfda0e17-a4e9-4a4f-9678-418901ed432a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.592384 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfda0e17-a4e9-4a4f-9678-418901ed432a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.592667 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfda0e17-a4e9-4a4f-9678-418901ed432a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.593556 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.598521 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.598634 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.598791 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.598818 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.605072 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7"] Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.626335 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.626769 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lphn7\" (UniqueName: \"kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.627094 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.729133 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lphn7\" (UniqueName: \"kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.729327 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.729374 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.734037 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.734067 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.746339 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lphn7\" (UniqueName: \"kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xk4g7\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.940739 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.959816 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42121d24-1598-4e99-890c-9e74b7576895" path="/var/lib/kubelet/pods/42121d24-1598-4e99-890c-9e74b7576895/volumes" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.961280 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9389f8bc-6161-444c-8d9c-712f0e494c99" path="/var/lib/kubelet/pods/9389f8bc-6161-444c-8d9c-712f0e494c99/volumes" Oct 02 11:48:21 crc kubenswrapper[4658]: I1002 11:48:21.961918 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="990db662-1228-4b71-92fb-af4f1aad1d79" path="/var/lib/kubelet/pods/990db662-1228-4b71-92fb-af4f1aad1d79/volumes" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.500503 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7"] Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.539470 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" event={"ID":"59aa0d09-3a44-4e0a-b2d2-7f297a223854","Type":"ContainerStarted","Data":"9be3744df5e19e3ce69efe15f3b1fd0c4958db120b444fa03f2b5969991cbaf7"} Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.626923 4658 scope.go:117] "RemoveContainer" containerID="f9b1fba6315acfc7dd04f38b51428968bfb5073789c75961311d897284c21eaa" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.648502 4658 scope.go:117] "RemoveContainer" containerID="a51e06eefc7e7d1905e64b87fda225406a015cb1b488d1028ba7c11302952bad" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.705334 4658 scope.go:117] "RemoveContainer" containerID="c0db916e67cae1a0a9338a4d1bc79c90fb4ad51c67a1febc4d5e41742cbd2836" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.727436 4658 scope.go:117] "RemoveContainer" containerID="460e2a82155c4f23a1957b27c27d2355d7052d7b2c2469a2a205ffc0b734ee04" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.748206 4658 scope.go:117] "RemoveContainer" containerID="cbe52a82df437a48099dba564b91a3aae02a948123a6b805fa211187fc907aa6" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.770268 4658 scope.go:117] "RemoveContainer" containerID="8c9d3ad474bcf1ab2f6607b84cffeea01b7f397c61d8c0df65a7a8b23a17aaa5" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.799751 4658 scope.go:117] "RemoveContainer" containerID="c8979b246183ec945438779ca471a0c822952f7e550171d5eafbd4ef9e5fdb26" Oct 02 11:48:22 crc kubenswrapper[4658]: I1002 11:48:22.854739 4658 scope.go:117] "RemoveContainer" containerID="0e571d20008829a47fd4be592c00b67333af9598c5d53ca600d02c8ff788d8e4" Oct 02 11:48:23 crc kubenswrapper[4658]: I1002 11:48:23.576026 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" event={"ID":"59aa0d09-3a44-4e0a-b2d2-7f297a223854","Type":"ContainerStarted","Data":"311002fa7fa441b24fd72ca13f78198e833aa52fa68999b5091a6ecd0eae3ae7"} Oct 02 11:48:23 crc kubenswrapper[4658]: I1002 11:48:23.601866 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" podStartSLOduration=2.187282025 podStartE2EDuration="2.601840186s" podCreationTimestamp="2025-10-02 11:48:21 +0000 UTC" firstStartedPulling="2025-10-02 11:48:22.50824433 +0000 UTC m=+1783.399397897" 
lastFinishedPulling="2025-10-02 11:48:22.922802491 +0000 UTC m=+1783.813956058" observedRunningTime="2025-10-02 11:48:23.594256682 +0000 UTC m=+1784.485410249" watchObservedRunningTime="2025-10-02 11:48:23.601840186 +0000 UTC m=+1784.492993753" Oct 02 11:48:25 crc kubenswrapper[4658]: I1002 11:48:25.949799 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:48:25 crc kubenswrapper[4658]: E1002 11:48:25.951049 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:48:38 crc kubenswrapper[4658]: I1002 11:48:38.948840 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:48:38 crc kubenswrapper[4658]: E1002 11:48:38.949811 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:48:48 crc kubenswrapper[4658]: I1002 11:48:48.044464 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-55974"] Oct 02 11:48:48 crc kubenswrapper[4658]: I1002 11:48:48.054383 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-55974"] Oct 02 11:48:49 crc kubenswrapper[4658]: I1002 11:48:49.955467 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:48:49 crc kubenswrapper[4658]: E1002 11:48:49.955752 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:48:49 crc kubenswrapper[4658]: I1002 11:48:49.960105 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e2ba1e-bb1f-4770-a261-979b3f467bce" path="/var/lib/kubelet/pods/f4e2ba1e-bb1f-4770-a261-979b3f467bce/volumes" Oct 02 11:49:00 crc kubenswrapper[4658]: I1002 11:49:00.938930 4658 generic.go:334] "Generic (PLEG): container finished" podID="59aa0d09-3a44-4e0a-b2d2-7f297a223854" containerID="311002fa7fa441b24fd72ca13f78198e833aa52fa68999b5091a6ecd0eae3ae7" exitCode=0 Oct 02 11:49:00 crc kubenswrapper[4658]: I1002 11:49:00.939021 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" event={"ID":"59aa0d09-3a44-4e0a-b2d2-7f297a223854","Type":"ContainerDied","Data":"311002fa7fa441b24fd72ca13f78198e833aa52fa68999b5091a6ecd0eae3ae7"} Oct 02 11:49:00 crc kubenswrapper[4658]: I1002 11:49:00.949361 4658 scope.go:117] "RemoveContainer" 
containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:49:00 crc kubenswrapper[4658]: E1002 11:49:00.949835 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.492436 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.580120 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lphn7\" (UniqueName: \"kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7\") pod \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.580393 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key\") pod \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.580455 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory\") pod \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\" (UID: \"59aa0d09-3a44-4e0a-b2d2-7f297a223854\") " Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.589545 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7" (OuterVolumeSpecName: "kube-api-access-lphn7") pod "59aa0d09-3a44-4e0a-b2d2-7f297a223854" (UID: "59aa0d09-3a44-4e0a-b2d2-7f297a223854"). InnerVolumeSpecName "kube-api-access-lphn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.629876 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59aa0d09-3a44-4e0a-b2d2-7f297a223854" (UID: "59aa0d09-3a44-4e0a-b2d2-7f297a223854"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.630352 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory" (OuterVolumeSpecName: "inventory") pod "59aa0d09-3a44-4e0a-b2d2-7f297a223854" (UID: "59aa0d09-3a44-4e0a-b2d2-7f297a223854"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.683070 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lphn7\" (UniqueName: \"kubernetes.io/projected/59aa0d09-3a44-4e0a-b2d2-7f297a223854-kube-api-access-lphn7\") on node \"crc\" DevicePath \"\"" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.683111 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.683128 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59aa0d09-3a44-4e0a-b2d2-7f297a223854-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.961952 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" event={"ID":"59aa0d09-3a44-4e0a-b2d2-7f297a223854","Type":"ContainerDied","Data":"9be3744df5e19e3ce69efe15f3b1fd0c4958db120b444fa03f2b5969991cbaf7"} Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.961989 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9be3744df5e19e3ce69efe15f3b1fd0c4958db120b444fa03f2b5969991cbaf7" Oct 02 11:49:02 crc kubenswrapper[4658]: I1002 11:49:02.962044 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xk4g7" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.065399 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p"] Oct 02 11:49:03 crc kubenswrapper[4658]: E1002 11:49:03.066388 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59aa0d09-3a44-4e0a-b2d2-7f297a223854" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.066482 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="59aa0d09-3a44-4e0a-b2d2-7f297a223854" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.066793 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="59aa0d09-3a44-4e0a-b2d2-7f297a223854" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.067756 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.070518 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.070742 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.070870 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.072904 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.087351 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p"] Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.195382 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.195432 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n27cx\" (UniqueName: \"kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.195480 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.297690 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n27cx\" (UniqueName: \"kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.297784 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.297924 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" 
(UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.306709 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.306889 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.326427 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n27cx\" (UniqueName: \"kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.385549 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.943801 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p"] Oct 02 11:49:03 crc kubenswrapper[4658]: I1002 11:49:03.972748 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" event={"ID":"c5792ae0-4758-472c-94b6-b4f313cc3462","Type":"ContainerStarted","Data":"5ef557bea3104f7b3f7f699e37168b1e79372d366c370cccfe186815afefe8d0"} Oct 02 11:49:04 crc kubenswrapper[4658]: I1002 11:49:04.994899 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" event={"ID":"c5792ae0-4758-472c-94b6-b4f313cc3462","Type":"ContainerStarted","Data":"9a78a0c9c50cb35548acbe6c5294ce306be68df2a9422f849cbca77085d72deb"} Oct 02 11:49:05 crc kubenswrapper[4658]: I1002 11:49:05.018108 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" podStartSLOduration=1.464895713 podStartE2EDuration="2.018086503s" podCreationTimestamp="2025-10-02 11:49:03 +0000 UTC" firstStartedPulling="2025-10-02 11:49:03.957559862 +0000 UTC m=+1824.848713429" lastFinishedPulling="2025-10-02 11:49:04.510750642 +0000 UTC m=+1825.401904219" observedRunningTime="2025-10-02 11:49:05.010773357 +0000 UTC m=+1825.901926934" watchObservedRunningTime="2025-10-02 11:49:05.018086503 +0000 UTC m=+1825.909240070" Oct 02 11:49:13 crc kubenswrapper[4658]: I1002 11:49:13.040600 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-skqqc"] Oct 02 11:49:13 crc kubenswrapper[4658]: I1002 11:49:13.052543 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-skqqc"] Oct 02 11:49:13 crc kubenswrapper[4658]: I1002 11:49:13.961187 4658 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86fefaf1-a889-4b79-b9bf-e53d04639c2e" path="/var/lib/kubelet/pods/86fefaf1-a889-4b79-b9bf-e53d04639c2e/volumes" Oct 02 11:49:14 crc kubenswrapper[4658]: I1002 11:49:14.949699 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:49:14 crc kubenswrapper[4658]: E1002 11:49:14.949980 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:49:18 crc kubenswrapper[4658]: I1002 11:49:18.039387 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9jhhj"] Oct 02 11:49:18 crc kubenswrapper[4658]: I1002 11:49:18.053953 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9jhhj"] Oct 02 11:49:19 crc kubenswrapper[4658]: I1002 11:49:19.961178 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a2866b-59b0-47dc-b036-cb6f5c08bd40" path="/var/lib/kubelet/pods/58a2866b-59b0-47dc-b036-cb6f5c08bd40/volumes" Oct 02 11:49:23 crc kubenswrapper[4658]: I1002 11:49:23.101189 4658 scope.go:117] "RemoveContainer" containerID="68e4a88a6165fdf99aca95846caaccdd2d173ccb2aaa09e4dc4623a9c1a01c17" Oct 02 11:49:23 crc kubenswrapper[4658]: I1002 11:49:23.152758 4658 scope.go:117] "RemoveContainer" containerID="5c5ec061387b5c2f557bb6fafd067a897cbb290cd247047bc8df3d28fc67117a" Oct 02 11:49:23 crc kubenswrapper[4658]: I1002 11:49:23.209582 4658 scope.go:117] "RemoveContainer" containerID="091b5f3ac43ac2ce0f6435c8dcd12c903bae95419f321d90aef8537cfb4b423e" Oct 02 11:49:26 crc kubenswrapper[4658]: I1002 11:49:26.949856 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:49:26 crc kubenswrapper[4658]: E1002 11:49:26.950522 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:49:38 crc kubenswrapper[4658]: I1002 11:49:38.949930 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:49:39 crc kubenswrapper[4658]: I1002 11:49:39.317620 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597"} Oct 02 11:49:57 crc kubenswrapper[4658]: I1002 11:49:57.050446 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fvg7b"] Oct 02 11:49:57 crc kubenswrapper[4658]: I1002 11:49:57.056970 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fvg7b"] Oct 02 11:49:57 crc kubenswrapper[4658]: I1002 
11:49:57.966011 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf757ce-6767-4bed-98a4-394baf2cc6f8" path="/var/lib/kubelet/pods/8bf757ce-6767-4bed-98a4-394baf2cc6f8/volumes" Oct 02 11:50:00 crc kubenswrapper[4658]: I1002 11:50:00.525363 4658 generic.go:334] "Generic (PLEG): container finished" podID="c5792ae0-4758-472c-94b6-b4f313cc3462" containerID="9a78a0c9c50cb35548acbe6c5294ce306be68df2a9422f849cbca77085d72deb" exitCode=2 Oct 02 11:50:00 crc kubenswrapper[4658]: I1002 11:50:00.525422 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" event={"ID":"c5792ae0-4758-472c-94b6-b4f313cc3462","Type":"ContainerDied","Data":"9a78a0c9c50cb35548acbe6c5294ce306be68df2a9422f849cbca77085d72deb"} Oct 02 11:50:01 crc kubenswrapper[4658]: I1002 11:50:01.984958 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.059427 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key\") pod \"c5792ae0-4758-472c-94b6-b4f313cc3462\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.059562 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory\") pod \"c5792ae0-4758-472c-94b6-b4f313cc3462\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.059661 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n27cx\" (UniqueName: \"kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx\") pod \"c5792ae0-4758-472c-94b6-b4f313cc3462\" (UID: \"c5792ae0-4758-472c-94b6-b4f313cc3462\") " Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.074436 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx" (OuterVolumeSpecName: "kube-api-access-n27cx") pod "c5792ae0-4758-472c-94b6-b4f313cc3462" (UID: "c5792ae0-4758-472c-94b6-b4f313cc3462"). InnerVolumeSpecName "kube-api-access-n27cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.102777 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5792ae0-4758-472c-94b6-b4f313cc3462" (UID: "c5792ae0-4758-472c-94b6-b4f313cc3462"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.114570 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory" (OuterVolumeSpecName: "inventory") pod "c5792ae0-4758-472c-94b6-b4f313cc3462" (UID: "c5792ae0-4758-472c-94b6-b4f313cc3462"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.162502 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.162541 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5792ae0-4758-472c-94b6-b4f313cc3462-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.162556 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n27cx\" (UniqueName: \"kubernetes.io/projected/c5792ae0-4758-472c-94b6-b4f313cc3462-kube-api-access-n27cx\") on node \"crc\" DevicePath \"\"" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.549398 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" event={"ID":"c5792ae0-4758-472c-94b6-b4f313cc3462","Type":"ContainerDied","Data":"5ef557bea3104f7b3f7f699e37168b1e79372d366c370cccfe186815afefe8d0"} Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.549442 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p" Oct 02 11:50:02 crc kubenswrapper[4658]: I1002 11:50:02.549454 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef557bea3104f7b3f7f699e37168b1e79372d366c370cccfe186815afefe8d0" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.034778 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml"] Oct 02 11:50:10 crc kubenswrapper[4658]: E1002 11:50:10.035828 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5792ae0-4758-472c-94b6-b4f313cc3462" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.035844 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5792ae0-4758-472c-94b6-b4f313cc3462" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.036027 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5792ae0-4758-472c-94b6-b4f313cc3462" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.036840 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.039338 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.040069 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.041792 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.041865 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.062065 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml"] Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.138026 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jffdk\" (UniqueName: \"kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.138150 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.138193 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.240440 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jffdk\" (UniqueName: \"kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.240646 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.240713 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" 
(UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.247446 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.248832 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.269108 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jffdk\" (UniqueName: \"kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzzml\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.386799 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.808566 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml"] Oct 02 11:50:10 crc kubenswrapper[4658]: I1002 11:50:10.823545 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:50:11 crc kubenswrapper[4658]: I1002 11:50:11.630043 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" event={"ID":"09073a04-723b-4564-8f3a-efbc628cb7ef","Type":"ContainerStarted","Data":"8ac389c4200b775eba004452c105dcbe62eb32116b304ab9e7f4e57c3d53cfce"} Oct 02 11:50:11 crc kubenswrapper[4658]: I1002 11:50:11.630480 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" event={"ID":"09073a04-723b-4564-8f3a-efbc628cb7ef","Type":"ContainerStarted","Data":"152d315731ff0ea580418a24688fb8336f062cd9345c2783f96396ea63beb2c1"} Oct 02 11:50:23 crc kubenswrapper[4658]: I1002 11:50:23.324000 4658 scope.go:117] "RemoveContainer" containerID="4f1378482fc5e3aa5b781216f0c7930df6346459804584d54143aa11ee81ebe8" Oct 02 11:50:58 crc kubenswrapper[4658]: I1002 11:50:58.052565 4658 generic.go:334] "Generic (PLEG): container finished" podID="09073a04-723b-4564-8f3a-efbc628cb7ef" containerID="8ac389c4200b775eba004452c105dcbe62eb32116b304ab9e7f4e57c3d53cfce" exitCode=0 Oct 02 11:50:58 crc kubenswrapper[4658]: I1002 11:50:58.052625 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" event={"ID":"09073a04-723b-4564-8f3a-efbc628cb7ef","Type":"ContainerDied","Data":"8ac389c4200b775eba004452c105dcbe62eb32116b304ab9e7f4e57c3d53cfce"} Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.606882 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.761739 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key\") pod \"09073a04-723b-4564-8f3a-efbc628cb7ef\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.762453 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jffdk\" (UniqueName: \"kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk\") pod \"09073a04-723b-4564-8f3a-efbc628cb7ef\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.762618 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory\") pod \"09073a04-723b-4564-8f3a-efbc628cb7ef\" (UID: \"09073a04-723b-4564-8f3a-efbc628cb7ef\") " Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.774491 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk" (OuterVolumeSpecName: "kube-api-access-jffdk") pod "09073a04-723b-4564-8f3a-efbc628cb7ef" (UID: "09073a04-723b-4564-8f3a-efbc628cb7ef"). InnerVolumeSpecName "kube-api-access-jffdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.787963 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "09073a04-723b-4564-8f3a-efbc628cb7ef" (UID: "09073a04-723b-4564-8f3a-efbc628cb7ef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.799654 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory" (OuterVolumeSpecName: "inventory") pod "09073a04-723b-4564-8f3a-efbc628cb7ef" (UID: "09073a04-723b-4564-8f3a-efbc628cb7ef"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.865810 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.866087 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jffdk\" (UniqueName: \"kubernetes.io/projected/09073a04-723b-4564-8f3a-efbc628cb7ef-kube-api-access-jffdk\") on node \"crc\" DevicePath \"\"" Oct 02 11:50:59 crc kubenswrapper[4658]: I1002 11:50:59.866183 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09073a04-723b-4564-8f3a-efbc628cb7ef-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.074629 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" event={"ID":"09073a04-723b-4564-8f3a-efbc628cb7ef","Type":"ContainerDied","Data":"152d315731ff0ea580418a24688fb8336f062cd9345c2783f96396ea63beb2c1"} Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.074893 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152d315731ff0ea580418a24688fb8336f062cd9345c2783f96396ea63beb2c1" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.074719 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzzml" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.171506 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz25k"] Oct 02 11:51:00 crc kubenswrapper[4658]: E1002 11:51:00.172010 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09073a04-723b-4564-8f3a-efbc628cb7ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.172030 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="09073a04-723b-4564-8f3a-efbc628cb7ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.172228 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="09073a04-723b-4564-8f3a-efbc628cb7ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.173044 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.177406 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.177477 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.177406 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.177411 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.192567 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz25k"] Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.273758 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.273958 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.274185 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbll\" (UniqueName: \"kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.376571 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbll\" (UniqueName: \"kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.376686 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.376784 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc 
kubenswrapper[4658]: I1002 11:51:00.380873 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.396051 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.401167 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbll\" (UniqueName: \"kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll\") pod \"ssh-known-hosts-edpm-deployment-dz25k\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:00 crc kubenswrapper[4658]: I1002 11:51:00.495747 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:01 crc kubenswrapper[4658]: I1002 11:51:01.041424 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz25k"] Oct 02 11:51:01 crc kubenswrapper[4658]: I1002 11:51:01.089918 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" event={"ID":"5a2e4e7a-11ed-4e29-b2f3-28919813fa63","Type":"ContainerStarted","Data":"9883b2cf9c0d134557c033d9b608576c62537190d354019a2a73dd2bbe55e0cb"} Oct 02 11:51:03 crc kubenswrapper[4658]: I1002 11:51:03.111270 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" event={"ID":"5a2e4e7a-11ed-4e29-b2f3-28919813fa63","Type":"ContainerStarted","Data":"aa9defc34df316d742662ec59bd7cce01ef64b0408dcd50f08f825f3b117d952"} Oct 02 11:51:03 crc kubenswrapper[4658]: I1002 11:51:03.144487 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" podStartSLOduration=1.717179733 podStartE2EDuration="3.144468862s" podCreationTimestamp="2025-10-02 11:51:00 +0000 UTC" firstStartedPulling="2025-10-02 11:51:01.051240148 +0000 UTC m=+1941.942393715" lastFinishedPulling="2025-10-02 11:51:02.478529277 +0000 UTC m=+1943.369682844" observedRunningTime="2025-10-02 11:51:03.136714644 +0000 UTC m=+1944.027868211" watchObservedRunningTime="2025-10-02 11:51:03.144468862 +0000 UTC m=+1944.035622449" Oct 02 11:51:10 crc kubenswrapper[4658]: I1002 11:51:10.179911 4658 generic.go:334] "Generic (PLEG): container finished" podID="5a2e4e7a-11ed-4e29-b2f3-28919813fa63" containerID="aa9defc34df316d742662ec59bd7cce01ef64b0408dcd50f08f825f3b117d952" exitCode=0 Oct 02 11:51:10 crc kubenswrapper[4658]: I1002 11:51:10.179991 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" event={"ID":"5a2e4e7a-11ed-4e29-b2f3-28919813fa63","Type":"ContainerDied","Data":"aa9defc34df316d742662ec59bd7cce01ef64b0408dcd50f08f825f3b117d952"} Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.635140 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.831453 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0\") pod \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.831759 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam\") pod \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.831803 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbll\" (UniqueName: \"kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll\") pod \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\" (UID: \"5a2e4e7a-11ed-4e29-b2f3-28919813fa63\") " Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.854495 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll" (OuterVolumeSpecName: "kube-api-access-5kbll") pod "5a2e4e7a-11ed-4e29-b2f3-28919813fa63" (UID: "5a2e4e7a-11ed-4e29-b2f3-28919813fa63"). InnerVolumeSpecName "kube-api-access-5kbll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.859444 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a2e4e7a-11ed-4e29-b2f3-28919813fa63" (UID: "5a2e4e7a-11ed-4e29-b2f3-28919813fa63"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.885429 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5a2e4e7a-11ed-4e29-b2f3-28919813fa63" (UID: "5a2e4e7a-11ed-4e29-b2f3-28919813fa63"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.935412 4658 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.935455 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:11 crc kubenswrapper[4658]: I1002 11:51:11.935469 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbll\" (UniqueName: \"kubernetes.io/projected/5a2e4e7a-11ed-4e29-b2f3-28919813fa63-kube-api-access-5kbll\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.200569 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" event={"ID":"5a2e4e7a-11ed-4e29-b2f3-28919813fa63","Type":"ContainerDied","Data":"9883b2cf9c0d134557c033d9b608576c62537190d354019a2a73dd2bbe55e0cb"} Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.201167 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9883b2cf9c0d134557c033d9b608576c62537190d354019a2a73dd2bbe55e0cb" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.200682 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz25k" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.282758 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh"] Oct 02 11:51:12 crc kubenswrapper[4658]: E1002 11:51:12.283647 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2e4e7a-11ed-4e29-b2f3-28919813fa63" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.283742 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2e4e7a-11ed-4e29-b2f3-28919813fa63" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.284100 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2e4e7a-11ed-4e29-b2f3-28919813fa63" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.285021 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.287247 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.287590 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.287747 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.287964 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.298252 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh"] Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.443230 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhqn\" (UniqueName: \"kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.443759 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.443815 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.545907 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhqn\" (UniqueName: \"kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.546072 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.546152 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.552487 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.553476 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.577909 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhqn\" (UniqueName: \"kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bztkh\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:12 crc kubenswrapper[4658]: I1002 11:51:12.613145 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:13 crc kubenswrapper[4658]: I1002 11:51:13.174128 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh"] Oct 02 11:51:13 crc kubenswrapper[4658]: I1002 11:51:13.214701 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" event={"ID":"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a","Type":"ContainerStarted","Data":"7421a2200544aeb0e1c8bfd9e6dccff412aa3f766d952acc37189967f491e5a7"} Oct 02 11:51:14 crc kubenswrapper[4658]: I1002 11:51:14.224836 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" event={"ID":"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a","Type":"ContainerStarted","Data":"c014d3f2f1454935f95842f49928a2820cff6fcf7c7775396803c27513a560ca"} Oct 02 11:51:14 crc kubenswrapper[4658]: I1002 11:51:14.246726 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" podStartSLOduration=1.584183406 podStartE2EDuration="2.246707542s" podCreationTimestamp="2025-10-02 11:51:12 +0000 UTC" firstStartedPulling="2025-10-02 11:51:13.185166513 +0000 UTC m=+1954.076320080" lastFinishedPulling="2025-10-02 11:51:13.847690649 +0000 UTC m=+1954.738844216" observedRunningTime="2025-10-02 11:51:14.241348791 +0000 UTC m=+1955.132502388" watchObservedRunningTime="2025-10-02 11:51:14.246707542 +0000 UTC m=+1955.137861109" Oct 02 11:51:23 crc kubenswrapper[4658]: I1002 11:51:23.317657 4658 generic.go:334] "Generic (PLEG): container finished" podID="26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" containerID="c014d3f2f1454935f95842f49928a2820cff6fcf7c7775396803c27513a560ca" exitCode=0 Oct 02 11:51:23 crc kubenswrapper[4658]: I1002 11:51:23.317749 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" 
event={"ID":"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a","Type":"ContainerDied","Data":"c014d3f2f1454935f95842f49928a2820cff6fcf7c7775396803c27513a560ca"} Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.876556 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.930977 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key\") pod \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.931030 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkhqn\" (UniqueName: \"kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn\") pod \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.931074 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory\") pod \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\" (UID: \"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a\") " Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.936685 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn" (OuterVolumeSpecName: "kube-api-access-mkhqn") pod "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" (UID: "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a"). InnerVolumeSpecName "kube-api-access-mkhqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.979010 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" (UID: "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:24 crc kubenswrapper[4658]: I1002 11:51:24.985126 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory" (OuterVolumeSpecName: "inventory") pod "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" (UID: "26a7e52d-c3b7-4a7d-ae46-c2f32adb479a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.034578 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkhqn\" (UniqueName: \"kubernetes.io/projected/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-kube-api-access-mkhqn\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.034625 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.034636 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26a7e52d-c3b7-4a7d-ae46-c2f32adb479a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.344046 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" event={"ID":"26a7e52d-c3b7-4a7d-ae46-c2f32adb479a","Type":"ContainerDied","Data":"7421a2200544aeb0e1c8bfd9e6dccff412aa3f766d952acc37189967f491e5a7"} Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.344106 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7421a2200544aeb0e1c8bfd9e6dccff412aa3f766d952acc37189967f491e5a7" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.344192 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bztkh" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.418808 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8"] Oct 02 11:51:25 crc kubenswrapper[4658]: E1002 11:51:25.419321 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.419346 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.419638 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a7e52d-c3b7-4a7d-ae46-c2f32adb479a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.420602 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.424864 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.425263 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.426735 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.432367 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.443381 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.443544 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.443802 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tb88\" (UniqueName: \"kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.452706 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8"] Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.544622 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.544728 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.544838 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tb88\" (UniqueName: \"kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: 
\"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.548570 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.548835 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.572700 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tb88\" (UniqueName: \"kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:25 crc kubenswrapper[4658]: I1002 11:51:25.744241 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:26 crc kubenswrapper[4658]: I1002 11:51:26.340383 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8"] Oct 02 11:51:27 crc kubenswrapper[4658]: I1002 11:51:27.380373 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" event={"ID":"f8637fd5-d51c-4da2-a043-98c8f655f10f","Type":"ContainerStarted","Data":"611269057c88288e1cc05048bbc21cf04a404e2585670007a9ae4b5b368ec72d"} Oct 02 11:51:27 crc kubenswrapper[4658]: I1002 11:51:27.380723 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" event={"ID":"f8637fd5-d51c-4da2-a043-98c8f655f10f","Type":"ContainerStarted","Data":"9344d4ec11ae5a91fca2c705e44143c3734ca59ecad708c9b3625634253b020b"} Oct 02 11:51:27 crc kubenswrapper[4658]: I1002 11:51:27.403392 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" podStartSLOduration=1.94324242 podStartE2EDuration="2.403365587s" podCreationTimestamp="2025-10-02 11:51:25 +0000 UTC" firstStartedPulling="2025-10-02 11:51:26.347314663 +0000 UTC m=+1967.238468230" lastFinishedPulling="2025-10-02 11:51:26.80743783 +0000 UTC m=+1967.698591397" observedRunningTime="2025-10-02 11:51:27.397273202 +0000 UTC m=+1968.288426779" watchObservedRunningTime="2025-10-02 11:51:27.403365587 +0000 UTC m=+1968.294519154" Oct 02 11:51:37 crc kubenswrapper[4658]: I1002 11:51:37.509846 4658 generic.go:334] "Generic (PLEG): container finished" podID="f8637fd5-d51c-4da2-a043-98c8f655f10f" containerID="611269057c88288e1cc05048bbc21cf04a404e2585670007a9ae4b5b368ec72d" exitCode=0 Oct 02 11:51:37 crc kubenswrapper[4658]: I1002 11:51:37.509947 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" 
event={"ID":"f8637fd5-d51c-4da2-a043-98c8f655f10f","Type":"ContainerDied","Data":"611269057c88288e1cc05048bbc21cf04a404e2585670007a9ae4b5b368ec72d"} Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.018230 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.070052 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key\") pod \"f8637fd5-d51c-4da2-a043-98c8f655f10f\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.070512 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory\") pod \"f8637fd5-d51c-4da2-a043-98c8f655f10f\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.070567 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tb88\" (UniqueName: \"kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88\") pod \"f8637fd5-d51c-4da2-a043-98c8f655f10f\" (UID: \"f8637fd5-d51c-4da2-a043-98c8f655f10f\") " Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.074661 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88" (OuterVolumeSpecName: "kube-api-access-9tb88") pod "f8637fd5-d51c-4da2-a043-98c8f655f10f" (UID: "f8637fd5-d51c-4da2-a043-98c8f655f10f"). InnerVolumeSpecName "kube-api-access-9tb88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.096747 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory" (OuterVolumeSpecName: "inventory") pod "f8637fd5-d51c-4da2-a043-98c8f655f10f" (UID: "f8637fd5-d51c-4da2-a043-98c8f655f10f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.112459 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f8637fd5-d51c-4da2-a043-98c8f655f10f" (UID: "f8637fd5-d51c-4da2-a043-98c8f655f10f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.173545 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.173749 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tb88\" (UniqueName: \"kubernetes.io/projected/f8637fd5-d51c-4da2-a043-98c8f655f10f-kube-api-access-9tb88\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.173818 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8637fd5-d51c-4da2-a043-98c8f655f10f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.534783 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" event={"ID":"f8637fd5-d51c-4da2-a043-98c8f655f10f","Type":"ContainerDied","Data":"9344d4ec11ae5a91fca2c705e44143c3734ca59ecad708c9b3625634253b020b"} Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.534818 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9344d4ec11ae5a91fca2c705e44143c3734ca59ecad708c9b3625634253b020b" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.534906 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.636942 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq"] Oct 02 11:51:39 crc kubenswrapper[4658]: E1002 11:51:39.637550 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8637fd5-d51c-4da2-a043-98c8f655f10f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.637577 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8637fd5-d51c-4da2-a043-98c8f655f10f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.637858 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8637fd5-d51c-4da2-a043-98c8f655f10f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.638842 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.643812 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645622 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645753 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645804 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645821 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645855 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.645955 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.646065 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.661632 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq"] Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681801 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681855 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681896 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681933 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681971 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.681998 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682018 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682068 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682092 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682111 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682159 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682177 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq266\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682204 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.682221 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: E1002 11:51:39.707664 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8637fd5_d51c_4da2_a043_98c8f655f10f.slice/crio-9344d4ec11ae5a91fca2c705e44143c3734ca59ecad708c9b3625634253b020b\": RecentStats: unable to find data in memory cache]" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.783975 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784313 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784346 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784370 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784425 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784450 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784468 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784517 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784536 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq266\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784565 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784586 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784611 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784629 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.784658 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.788373 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.789090 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.789446 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.789624 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.790049 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.790371 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.791357 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.791501 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.792884 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.793933 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.794218 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.795099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.798745 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.802044 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq266\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.976013 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:51:39 crc kubenswrapper[4658]: I1002 11:51:39.983694 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:51:40 crc kubenswrapper[4658]: I1002 11:51:40.512521 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq"] Oct 02 11:51:40 crc kubenswrapper[4658]: I1002 11:51:40.544134 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" event={"ID":"8d5900ee-9fca-4a00-8343-b51c6728627d","Type":"ContainerStarted","Data":"6e723ea8cd90e3ecccc365de08b5f02f61101c4949bfe2323b0698c13d3b6c9c"} Oct 02 11:51:41 crc kubenswrapper[4658]: I1002 11:51:41.104699 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:51:41 crc kubenswrapper[4658]: I1002 11:51:41.557243 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" event={"ID":"8d5900ee-9fca-4a00-8343-b51c6728627d","Type":"ContainerStarted","Data":"277a625793ecfd643c3c307fa86f470e3c8a25caa952e108c71ae46455f6d06b"} Oct 02 11:51:41 crc kubenswrapper[4658]: I1002 11:51:41.591421 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" podStartSLOduration=2.012372948 podStartE2EDuration="2.591400755s" podCreationTimestamp="2025-10-02 11:51:39 +0000 UTC" firstStartedPulling="2025-10-02 11:51:40.521936803 +0000 UTC m=+1981.413090380" lastFinishedPulling="2025-10-02 11:51:41.10096462 +0000 UTC m=+1981.992118187" observedRunningTime="2025-10-02 11:51:41.578036248 +0000 UTC m=+1982.469189815" watchObservedRunningTime="2025-10-02 11:51:41.591400755 +0000 UTC m=+1982.482554322" Oct 02 11:51:57 crc kubenswrapper[4658]: I1002 11:51:57.430388 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:51:57 crc kubenswrapper[4658]: I1002 11:51:57.431184 4658 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.609432 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.631269 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.641396 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.715586 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.716025 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcpc\" (UniqueName: \"kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.716200 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.818005 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.818131 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.818207 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcpc\" (UniqueName: \"kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.818608 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " 
pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.818810 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.837477 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcpc\" (UniqueName: \"kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc\") pod \"redhat-operators-9ncsd\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:10 crc kubenswrapper[4658]: I1002 11:52:10.962791 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:11 crc kubenswrapper[4658]: I1002 11:52:11.428766 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:11 crc kubenswrapper[4658]: I1002 11:52:11.883078 4658 generic.go:334] "Generic (PLEG): container finished" podID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerID="1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527" exitCode=0 Oct 02 11:52:11 crc kubenswrapper[4658]: I1002 11:52:11.883182 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerDied","Data":"1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527"} Oct 02 11:52:11 crc kubenswrapper[4658]: I1002 11:52:11.883490 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerStarted","Data":"539719ea8ce26f1d7b1c6ba518b05165af28d38cba3bfe316b094d48a7a0d52d"} Oct 02 11:52:13 crc kubenswrapper[4658]: I1002 11:52:13.902029 4658 generic.go:334] "Generic (PLEG): container finished" podID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerID="3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a" exitCode=0 Oct 02 11:52:13 crc kubenswrapper[4658]: I1002 11:52:13.902100 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerDied","Data":"3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a"} Oct 02 11:52:14 crc kubenswrapper[4658]: I1002 11:52:14.913427 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerStarted","Data":"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5"} Oct 02 11:52:14 crc kubenswrapper[4658]: I1002 11:52:14.950973 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9ncsd" podStartSLOduration=2.43469344 podStartE2EDuration="4.950950125s" podCreationTimestamp="2025-10-02 11:52:10 +0000 UTC" firstStartedPulling="2025-10-02 11:52:11.884856646 +0000 UTC m=+2012.776010213" lastFinishedPulling="2025-10-02 11:52:14.401113331 +0000 UTC m=+2015.292266898" observedRunningTime="2025-10-02 11:52:14.930178211 +0000 UTC m=+2015.821331778" 
watchObservedRunningTime="2025-10-02 11:52:14.950950125 +0000 UTC m=+2015.842103692" Oct 02 11:52:20 crc kubenswrapper[4658]: I1002 11:52:20.964101 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:20 crc kubenswrapper[4658]: I1002 11:52:20.964690 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:21 crc kubenswrapper[4658]: I1002 11:52:21.030908 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:21 crc kubenswrapper[4658]: I1002 11:52:21.092328 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:21 crc kubenswrapper[4658]: I1002 11:52:21.268356 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:21 crc kubenswrapper[4658]: I1002 11:52:21.993216 4658 generic.go:334] "Generic (PLEG): container finished" podID="8d5900ee-9fca-4a00-8343-b51c6728627d" containerID="277a625793ecfd643c3c307fa86f470e3c8a25caa952e108c71ae46455f6d06b" exitCode=0 Oct 02 11:52:21 crc kubenswrapper[4658]: I1002 11:52:21.993317 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" event={"ID":"8d5900ee-9fca-4a00-8343-b51c6728627d","Type":"ContainerDied","Data":"277a625793ecfd643c3c307fa86f470e3c8a25caa952e108c71ae46455f6d06b"} Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.002433 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9ncsd" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="registry-server" containerID="cri-o://74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5" gracePeriod=2 Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.500969 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.510616 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.663729 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.663843 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.663908 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.663935 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.663960 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664000 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664038 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664064 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664102 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkcpc\" (UniqueName: \"kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc\") pod \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " Oct 02 11:52:23 crc 
kubenswrapper[4658]: I1002 11:52:23.664134 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664192 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664216 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities\") pod \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664239 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content\") pod \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\" (UID: \"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664261 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664288 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664346 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.664404 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq266\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266\") pod \"8d5900ee-9fca-4a00-8343-b51c6728627d\" (UID: \"8d5900ee-9fca-4a00-8343-b51c6728627d\") " Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.665968 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities" (OuterVolumeSpecName: "utilities") pod "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" (UID: "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.671641 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc" (OuterVolumeSpecName: "kube-api-access-qkcpc") pod "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" (UID: "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771"). InnerVolumeSpecName "kube-api-access-qkcpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.671811 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.671920 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.672672 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.673370 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266" (OuterVolumeSpecName: "kube-api-access-lq266") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "kube-api-access-lq266". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.673505 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.674494 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.675983 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.676536 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.676909 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.676998 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.677227 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.677497 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.705192 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.719052 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory" (OuterVolumeSpecName: "inventory") pod "8d5900ee-9fca-4a00-8343-b51c6728627d" (UID: "8d5900ee-9fca-4a00-8343-b51c6728627d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767923 4658 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767957 4658 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767967 4658 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767977 4658 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767986 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.767996 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768005 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768016 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkcpc\" (UniqueName: \"kubernetes.io/projected/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-kube-api-access-qkcpc\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768027 4658 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768036 4658 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768044 4658 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768052 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768061 4658 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768070 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768079 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq266\" (UniqueName: \"kubernetes.io/projected/8d5900ee-9fca-4a00-8343-b51c6728627d-kube-api-access-lq266\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:23 crc kubenswrapper[4658]: I1002 11:52:23.768088 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d5900ee-9fca-4a00-8343-b51c6728627d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.024576 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" event={"ID":"8d5900ee-9fca-4a00-8343-b51c6728627d","Type":"ContainerDied","Data":"6e723ea8cd90e3ecccc365de08b5f02f61101c4949bfe2323b0698c13d3b6c9c"} Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.024623 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e723ea8cd90e3ecccc365de08b5f02f61101c4949bfe2323b0698c13d3b6c9c" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.024699 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.035528 4658 generic.go:334] "Generic (PLEG): container finished" podID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerID="74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5" exitCode=0 Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.035577 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerDied","Data":"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5"} Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.035608 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ncsd" event={"ID":"032f76a7-f4eb-4f08-84e9-9f9d4b8ae771","Type":"ContainerDied","Data":"539719ea8ce26f1d7b1c6ba518b05165af28d38cba3bfe316b094d48a7a0d52d"} Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.035627 4658 scope.go:117] "RemoveContainer" containerID="74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.035779 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ncsd" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.070376 4658 scope.go:117] "RemoveContainer" containerID="3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.126494 4658 scope.go:117] "RemoveContainer" containerID="1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.190382 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw"] Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.190960 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="extract-content" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.190975 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="extract-content" Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.190991 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="extract-utilities" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.190998 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="extract-utilities" Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.191015 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="registry-server" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.191021 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="registry-server" Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.191042 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5900ee-9fca-4a00-8343-b51c6728627d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.191049 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5900ee-9fca-4a00-8343-b51c6728627d" 
containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.191234 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5900ee-9fca-4a00-8343-b51c6728627d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.191251 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" containerName="registry-server" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.191916 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.195225 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.195458 4658 scope.go:117] "RemoveContainer" containerID="74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5" Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.196003 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5\": container with ID starting with 74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5 not found: ID does not exist" containerID="74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.196035 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5"} err="failed to get container status \"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5\": rpc error: code = NotFound desc = could not find container \"74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5\": container with ID starting with 74c522ecf00e63fb327eae5e8860fcd7b8926042ce4db8f23b85736344f53fd5 not found: ID does not exist" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.196054 4658 scope.go:117] "RemoveContainer" containerID="3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a" Oct 02 11:52:24 crc kubenswrapper[4658]: E1002 11:52:24.196445 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a\": container with ID starting with 3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a not found: ID does not exist" containerID="3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.196467 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a"} err="failed to get container status \"3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a\": rpc error: code = NotFound desc = could not find container \"3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a\": container with ID starting with 3d979f61d66c394f99100329e13e908c70e11b96f418fc262a8c246d7066f66a not found: ID does not exist" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.196480 4658 scope.go:117] "RemoveContainer" containerID="1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527" Oct 02 11:52:24 crc 
kubenswrapper[4658]: E1002 11:52:24.196651 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527\": container with ID starting with 1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527 not found: ID does not exist" containerID="1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.196670 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527"} err="failed to get container status \"1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527\": rpc error: code = NotFound desc = could not find container \"1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527\": container with ID starting with 1b486e8ed2a10e13bf25c123b0349cdc69932edf0561ce84f52fa22a306f3527 not found: ID does not exist" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.197033 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.197133 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.197219 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.197405 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.200550 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw"] Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.377767 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b7s\" (UniqueName: \"kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.377897 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.377936 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.378204 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.378255 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.476001 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" (UID: "032f76a7-f4eb-4f08-84e9-9f9d4b8ae771"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.479923 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.479991 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.480065 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.480085 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.480105 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b7s\" (UniqueName: \"kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.480176 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:52:24 crc kubenswrapper[4658]: 
I1002 11:52:24.481285 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.485799 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.486247 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.488457 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.498365 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b7s\" (UniqueName: \"kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gkwgw\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.553026 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.696053 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:24 crc kubenswrapper[4658]: I1002 11:52:24.713529 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9ncsd"] Oct 02 11:52:25 crc kubenswrapper[4658]: I1002 11:52:25.116547 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw"] Oct 02 11:52:25 crc kubenswrapper[4658]: W1002 11:52:25.116640 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5fa727_3d1f_4293_a2c2_33ba1f10ae2b.slice/crio-449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d WatchSource:0}: Error finding container 449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d: Status 404 returned error can't find the container with id 449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d Oct 02 11:52:25 crc kubenswrapper[4658]: I1002 11:52:25.965937 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032f76a7-f4eb-4f08-84e9-9f9d4b8ae771" path="/var/lib/kubelet/pods/032f76a7-f4eb-4f08-84e9-9f9d4b8ae771/volumes" Oct 02 11:52:26 crc kubenswrapper[4658]: I1002 11:52:26.057189 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" event={"ID":"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b","Type":"ContainerStarted","Data":"51e045d3041020ee63f2905dede81f2e7555d7efc6291dca34ca1cb687a0ca9a"} Oct 02 11:52:26 crc kubenswrapper[4658]: I1002 11:52:26.057247 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" event={"ID":"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b","Type":"ContainerStarted","Data":"449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d"} Oct 02 11:52:26 crc kubenswrapper[4658]: I1002 11:52:26.080156 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" podStartSLOduration=1.4116872919999999 podStartE2EDuration="2.080138387s" podCreationTimestamp="2025-10-02 11:52:24 +0000 UTC" firstStartedPulling="2025-10-02 11:52:25.120058141 +0000 UTC m=+2026.011211718" lastFinishedPulling="2025-10-02 11:52:25.788509206 +0000 UTC m=+2026.679662813" observedRunningTime="2025-10-02 11:52:26.074493456 +0000 UTC m=+2026.965647033" watchObservedRunningTime="2025-10-02 11:52:26.080138387 +0000 UTC m=+2026.971291954" Oct 02 11:52:27 crc kubenswrapper[4658]: I1002 11:52:27.429798 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:52:27 crc kubenswrapper[4658]: I1002 11:52:27.430179 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:52:57 crc kubenswrapper[4658]: I1002 11:52:57.430259 4658 patch_prober.go:28] 
interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:52:57 crc kubenswrapper[4658]: I1002 11:52:57.431046 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:52:57 crc kubenswrapper[4658]: I1002 11:52:57.431110 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:52:57 crc kubenswrapper[4658]: I1002 11:52:57.432262 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:52:57 crc kubenswrapper[4658]: I1002 11:52:57.432381 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597" gracePeriod=600 Oct 02 11:52:58 crc kubenswrapper[4658]: I1002 11:52:58.404456 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597" exitCode=0 Oct 02 11:52:58 crc kubenswrapper[4658]: I1002 11:52:58.404638 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597"} Oct 02 11:52:58 crc kubenswrapper[4658]: I1002 11:52:58.404747 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"} Oct 02 11:52:58 crc kubenswrapper[4658]: I1002 11:52:58.404768 4658 scope.go:117] "RemoveContainer" containerID="7a795a2babba39b48463358f20445f060be3f19165c6038c4d5706656dc0a48f" Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.840416 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.845203 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.855772 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.901150 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.901394 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgz42\" (UniqueName: \"kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:29 crc kubenswrapper[4658]: I1002 11:53:29.901577 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.002788 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.002878 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgz42\" (UniqueName: \"kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.002977 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.003400 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.003833 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.024642 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jgz42\" (UniqueName: \"kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42\") pod \"certified-operators-5cfx5\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.171930 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.683666 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 11:53:30 crc kubenswrapper[4658]: I1002 11:53:30.744696 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerStarted","Data":"1120ca82d8bdab1a5a254ac8d82383bb0892eb4eeb24f5993846a8153d9993b2"} Oct 02 11:53:31 crc kubenswrapper[4658]: I1002 11:53:31.758945 4658 generic.go:334] "Generic (PLEG): container finished" podID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerID="bf436456450b87d3ad415542f4e510a51ca4874bb528d2bbe9b5c4d595208f77" exitCode=0 Oct 02 11:53:31 crc kubenswrapper[4658]: I1002 11:53:31.759004 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerDied","Data":"bf436456450b87d3ad415542f4e510a51ca4874bb528d2bbe9b5c4d595208f77"} Oct 02 11:53:33 crc kubenswrapper[4658]: I1002 11:53:33.788189 4658 generic.go:334] "Generic (PLEG): container finished" podID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerID="8c6b132cbfb3f52af756472b2c7a2c4f91cce0dfad98bce88f0ef9ff31440d9b" exitCode=0 Oct 02 11:53:33 crc kubenswrapper[4658]: I1002 11:53:33.788342 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerDied","Data":"8c6b132cbfb3f52af756472b2c7a2c4f91cce0dfad98bce88f0ef9ff31440d9b"} Oct 02 11:53:33 crc kubenswrapper[4658]: I1002 11:53:33.791731 4658 generic.go:334] "Generic (PLEG): container finished" podID="3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" containerID="51e045d3041020ee63f2905dede81f2e7555d7efc6291dca34ca1cb687a0ca9a" exitCode=0 Oct 02 11:53:33 crc kubenswrapper[4658]: I1002 11:53:33.791778 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" event={"ID":"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b","Type":"ContainerDied","Data":"51e045d3041020ee63f2905dede81f2e7555d7efc6291dca34ca1cb687a0ca9a"} Oct 02 11:53:34 crc kubenswrapper[4658]: I1002 11:53:34.805445 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerStarted","Data":"9a7226b06c4f055f9ab622a3fd03cce6126bdcda9b36abf785a5fd45e055e593"} Oct 02 11:53:34 crc kubenswrapper[4658]: I1002 11:53:34.829530 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5cfx5" podStartSLOduration=3.29624365 podStartE2EDuration="5.829507468s" podCreationTimestamp="2025-10-02 11:53:29 +0000 UTC" firstStartedPulling="2025-10-02 11:53:31.761672563 +0000 UTC m=+2092.652826130" lastFinishedPulling="2025-10-02 11:53:34.294936371 +0000 UTC m=+2095.186089948" 
observedRunningTime="2025-10-02 11:53:34.825089096 +0000 UTC m=+2095.716242663" watchObservedRunningTime="2025-10-02 11:53:34.829507468 +0000 UTC m=+2095.720661035" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.267947 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.317621 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory\") pod \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.318124 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7b7s\" (UniqueName: \"kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s\") pod \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.318163 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0\") pod \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.318232 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key\") pod \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.318367 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle\") pod \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\" (UID: \"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b\") " Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.333953 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s" (OuterVolumeSpecName: "kube-api-access-m7b7s") pod "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" (UID: "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b"). InnerVolumeSpecName "kube-api-access-m7b7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.334124 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" (UID: "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.343165 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" (UID: "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.348516 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory" (OuterVolumeSpecName: "inventory") pod "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" (UID: "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.354392 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" (UID: "3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.421328 4658 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.421365 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.421374 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7b7s\" (UniqueName: \"kubernetes.io/projected/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-kube-api-access-m7b7s\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.421384 4658 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.421393 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.822960 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.824439 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gkwgw" event={"ID":"3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b","Type":"ContainerDied","Data":"449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d"} Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.824474 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="449a2d34efd2aec6a23710c158662b450d9ab6d51ff805beb3b3ebf8d0dfc82d" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.934795 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb"] Oct 02 11:53:35 crc kubenswrapper[4658]: E1002 11:53:35.935226 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.935244 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.935552 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.936246 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.938537 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.939077 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.939795 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.939812 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.939899 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.944017 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:53:35 crc kubenswrapper[4658]: I1002 11:53:35.983065 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb"] Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.032643 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.032735 
4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.032768 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.032934 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.033921 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnhm\" (UniqueName: \"kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.034035 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.136473 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.136893 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.136924 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.137042 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.137104 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnhm\" (UniqueName: \"kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.137191 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.141545 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.141595 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.142129 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.144231 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.145692 4658 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.157479 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnhm\" (UniqueName: \"kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.274039 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.774465 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb"] Oct 02 11:53:36 crc kubenswrapper[4658]: I1002 11:53:36.832588 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" event={"ID":"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f","Type":"ContainerStarted","Data":"a327d0ce0a9b5153d2e60dd4d16c214151ab1e63529de9aea126f0c5c3ea10b2"} Oct 02 11:53:37 crc kubenswrapper[4658]: I1002 11:53:37.847059 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" event={"ID":"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f","Type":"ContainerStarted","Data":"15ad9d4fb8009cbb72932c7fe8c3bde1cb12739714a937bb9fbb98de4929fbc1"} Oct 02 11:53:37 crc kubenswrapper[4658]: I1002 11:53:37.867082 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" podStartSLOduration=2.394590412 podStartE2EDuration="2.867061033s" podCreationTimestamp="2025-10-02 11:53:35 +0000 UTC" firstStartedPulling="2025-10-02 11:53:36.780311709 +0000 UTC m=+2097.671465276" lastFinishedPulling="2025-10-02 11:53:37.25278233 +0000 UTC m=+2098.143935897" observedRunningTime="2025-10-02 11:53:37.863755048 +0000 UTC m=+2098.754908615" watchObservedRunningTime="2025-10-02 11:53:37.867061033 +0000 UTC m=+2098.758214600" Oct 02 11:53:40 crc kubenswrapper[4658]: I1002 11:53:40.179484 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:40 crc kubenswrapper[4658]: I1002 11:53:40.180064 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:40 crc kubenswrapper[4658]: I1002 11:53:40.234701 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:40 crc kubenswrapper[4658]: I1002 11:53:40.947648 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.029580 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 
11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.030673 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5cfx5" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="registry-server" containerID="cri-o://9a7226b06c4f055f9ab622a3fd03cce6126bdcda9b36abf785a5fd45e055e593" gracePeriod=2 Oct 02 11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.975314 4658 generic.go:334] "Generic (PLEG): container finished" podID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerID="9a7226b06c4f055f9ab622a3fd03cce6126bdcda9b36abf785a5fd45e055e593" exitCode=0 Oct 02 11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.975358 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerDied","Data":"9a7226b06c4f055f9ab622a3fd03cce6126bdcda9b36abf785a5fd45e055e593"} Oct 02 11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.975738 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cfx5" event={"ID":"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04","Type":"ContainerDied","Data":"1120ca82d8bdab1a5a254ac8d82383bb0892eb4eeb24f5993846a8153d9993b2"} Oct 02 11:53:48 crc kubenswrapper[4658]: I1002 11:53:48.975764 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1120ca82d8bdab1a5a254ac8d82383bb0892eb4eeb24f5993846a8153d9993b2" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.030014 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.122789 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities\") pod \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.122853 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content\") pod \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.123040 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgz42\" (UniqueName: \"kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42\") pod \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\" (UID: \"d1ce51bd-1fdd-4346-bcbc-0592e0a45b04\") " Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.124254 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities" (OuterVolumeSpecName: "utilities") pod "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" (UID: "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.129071 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42" (OuterVolumeSpecName: "kube-api-access-jgz42") pod "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" (UID: "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04"). 
InnerVolumeSpecName "kube-api-access-jgz42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.167639 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" (UID: "d1ce51bd-1fdd-4346-bcbc-0592e0a45b04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.225993 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.226046 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgz42\" (UniqueName: \"kubernetes.io/projected/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-kube-api-access-jgz42\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.226070 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:49 crc kubenswrapper[4658]: I1002 11:53:49.983895 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cfx5" Oct 02 11:53:50 crc kubenswrapper[4658]: I1002 11:53:50.026603 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 11:53:50 crc kubenswrapper[4658]: I1002 11:53:50.040312 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5cfx5"] Oct 02 11:53:51 crc kubenswrapper[4658]: I1002 11:53:51.977146 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" path="/var/lib/kubelet/pods/d1ce51bd-1fdd-4346-bcbc-0592e0a45b04/volumes" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.772790 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:02 crc kubenswrapper[4658]: E1002 11:54:02.773924 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="registry-server" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.773943 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="registry-server" Oct 02 11:54:02 crc kubenswrapper[4658]: E1002 11:54:02.773992 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="extract-content" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.774001 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="extract-content" Oct 02 11:54:02 crc kubenswrapper[4658]: E1002 11:54:02.774020 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="extract-utilities" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.774028 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="extract-utilities" Oct 02 11:54:02 crc 
kubenswrapper[4658]: I1002 11:54:02.774259 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ce51bd-1fdd-4346-bcbc-0592e0a45b04" containerName="registry-server" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.775867 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.791807 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.906510 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.907060 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:02 crc kubenswrapper[4658]: I1002 11:54:02.907248 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltss\" (UniqueName: \"kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.008777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.008889 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xltss\" (UniqueName: \"kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.008927 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.009454 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.009507 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.028963 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xltss\" (UniqueName: \"kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss\") pod \"community-operators-9w2k6\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.105662 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:03 crc kubenswrapper[4658]: I1002 11:54:03.671599 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:03 crc kubenswrapper[4658]: W1002 11:54:03.673791 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9a9aa43_b2c9_4339_856d_fd778252971e.slice/crio-fd9342e05613828188113b9e0ea89ff9eaceede63bb1a4d6c1659665ff3805b7 WatchSource:0}: Error finding container fd9342e05613828188113b9e0ea89ff9eaceede63bb1a4d6c1659665ff3805b7: Status 404 returned error can't find the container with id fd9342e05613828188113b9e0ea89ff9eaceede63bb1a4d6c1659665ff3805b7 Oct 02 11:54:04 crc kubenswrapper[4658]: I1002 11:54:04.200721 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerID="ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f" exitCode=0 Oct 02 11:54:04 crc kubenswrapper[4658]: I1002 11:54:04.200781 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerDied","Data":"ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f"} Oct 02 11:54:04 crc kubenswrapper[4658]: I1002 11:54:04.201008 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerStarted","Data":"fd9342e05613828188113b9e0ea89ff9eaceede63bb1a4d6c1659665ff3805b7"} Oct 02 11:54:05 crc kubenswrapper[4658]: I1002 11:54:05.213694 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerStarted","Data":"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b"} Oct 02 11:54:06 crc kubenswrapper[4658]: I1002 11:54:06.225133 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerID="fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b" exitCode=0 Oct 02 11:54:06 crc kubenswrapper[4658]: I1002 11:54:06.225229 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerDied","Data":"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b"} Oct 02 11:54:07 crc kubenswrapper[4658]: I1002 11:54:07.241711 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" 
event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerStarted","Data":"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8"} Oct 02 11:54:07 crc kubenswrapper[4658]: I1002 11:54:07.268675 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w2k6" podStartSLOduration=2.798626182 podStartE2EDuration="5.268641369s" podCreationTimestamp="2025-10-02 11:54:02 +0000 UTC" firstStartedPulling="2025-10-02 11:54:04.203266993 +0000 UTC m=+2125.094420570" lastFinishedPulling="2025-10-02 11:54:06.67328218 +0000 UTC m=+2127.564435757" observedRunningTime="2025-10-02 11:54:07.260776118 +0000 UTC m=+2128.151929685" watchObservedRunningTime="2025-10-02 11:54:07.268641369 +0000 UTC m=+2128.159794996" Oct 02 11:54:13 crc kubenswrapper[4658]: I1002 11:54:13.106795 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:13 crc kubenswrapper[4658]: I1002 11:54:13.107343 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:13 crc kubenswrapper[4658]: I1002 11:54:13.180089 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:13 crc kubenswrapper[4658]: I1002 11:54:13.361539 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:13 crc kubenswrapper[4658]: I1002 11:54:13.430733 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:15 crc kubenswrapper[4658]: I1002 11:54:15.337454 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w2k6" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="registry-server" containerID="cri-o://d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8" gracePeriod=2 Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.300562 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.350692 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerID="d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8" exitCode=0 Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.350740 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerDied","Data":"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8"} Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.350812 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w2k6" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.351057 4658 scope.go:117] "RemoveContainer" containerID="d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.351045 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w2k6" event={"ID":"d9a9aa43-b2c9-4339-856d-fd778252971e","Type":"ContainerDied","Data":"fd9342e05613828188113b9e0ea89ff9eaceede63bb1a4d6c1659665ff3805b7"} Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.379822 4658 scope.go:117] "RemoveContainer" containerID="fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.404439 4658 scope.go:117] "RemoveContainer" containerID="ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.409644 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content\") pod \"d9a9aa43-b2c9-4339-856d-fd778252971e\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.409775 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xltss\" (UniqueName: \"kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss\") pod \"d9a9aa43-b2c9-4339-856d-fd778252971e\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.409915 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities\") pod \"d9a9aa43-b2c9-4339-856d-fd778252971e\" (UID: \"d9a9aa43-b2c9-4339-856d-fd778252971e\") " Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.411142 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities" (OuterVolumeSpecName: "utilities") pod "d9a9aa43-b2c9-4339-856d-fd778252971e" (UID: "d9a9aa43-b2c9-4339-856d-fd778252971e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.415880 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss" (OuterVolumeSpecName: "kube-api-access-xltss") pod "d9a9aa43-b2c9-4339-856d-fd778252971e" (UID: "d9a9aa43-b2c9-4339-856d-fd778252971e"). InnerVolumeSpecName "kube-api-access-xltss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.465859 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9a9aa43-b2c9-4339-856d-fd778252971e" (UID: "d9a9aa43-b2c9-4339-856d-fd778252971e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.489891 4658 scope.go:117] "RemoveContainer" containerID="d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8" Oct 02 11:54:16 crc kubenswrapper[4658]: E1002 11:54:16.490665 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8\": container with ID starting with d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8 not found: ID does not exist" containerID="d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.490717 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8"} err="failed to get container status \"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8\": rpc error: code = NotFound desc = could not find container \"d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8\": container with ID starting with d817062c586a3a088e421f3f0b500aa91c1fa64b485288bff72b900b002e61e8 not found: ID does not exist" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.490744 4658 scope.go:117] "RemoveContainer" containerID="fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b" Oct 02 11:54:16 crc kubenswrapper[4658]: E1002 11:54:16.491042 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b\": container with ID starting with fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b not found: ID does not exist" containerID="fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.491073 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b"} err="failed to get container status \"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b\": rpc error: code = NotFound desc = could not find container \"fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b\": container with ID starting with fe094ebb999f867ee3e77ddbec4dd8b98773de9a2a73aeaf8d9182ae0051769b not found: ID does not exist" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.491095 4658 scope.go:117] "RemoveContainer" containerID="ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f" Oct 02 11:54:16 crc kubenswrapper[4658]: E1002 11:54:16.491450 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f\": container with ID starting with ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f not found: ID does not exist" containerID="ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.491478 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f"} err="failed to get container status \"ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f\": rpc error: code = NotFound desc = could not 
find container \"ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f\": container with ID starting with ecf9005744e50ba0d60a33582008fd60196b507f04de2513a24df9cef46adc3f not found: ID does not exist" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.512398 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.512428 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a9aa43-b2c9-4339-856d-fd778252971e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.512441 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xltss\" (UniqueName: \"kubernetes.io/projected/d9a9aa43-b2c9-4339-856d-fd778252971e-kube-api-access-xltss\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.694107 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:16 crc kubenswrapper[4658]: I1002 11:54:16.704236 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w2k6"] Oct 02 11:54:17 crc kubenswrapper[4658]: I1002 11:54:17.963156 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" path="/var/lib/kubelet/pods/d9a9aa43-b2c9-4339-856d-fd778252971e/volumes" Oct 02 11:54:28 crc kubenswrapper[4658]: I1002 11:54:28.503186 4658 generic.go:334] "Generic (PLEG): container finished" podID="ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" containerID="15ad9d4fb8009cbb72932c7fe8c3bde1cb12739714a937bb9fbb98de4929fbc1" exitCode=0 Oct 02 11:54:28 crc kubenswrapper[4658]: I1002 11:54:28.503320 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" event={"ID":"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f","Type":"ContainerDied","Data":"15ad9d4fb8009cbb72932c7fe8c3bde1cb12739714a937bb9fbb98de4929fbc1"} Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.033220 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.102987 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.103668 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.103895 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.104627 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfnhm\" (UniqueName: \"kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.105330 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.105632 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key\") pod \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\" (UID: \"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f\") " Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.111517 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm" (OuterVolumeSpecName: "kube-api-access-tfnhm") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "kube-api-access-tfnhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.113188 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.138482 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.149074 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.164015 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.182717 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory" (OuterVolumeSpecName: "inventory") pod "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" (UID: "ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210425 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210473 4658 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210489 4658 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210502 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210517 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfnhm\" (UniqueName: \"kubernetes.io/projected/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-kube-api-access-tfnhm\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.210528 4658 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.531661 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" event={"ID":"ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f","Type":"ContainerDied","Data":"a327d0ce0a9b5153d2e60dd4d16c214151ab1e63529de9aea126f0c5c3ea10b2"} Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.532002 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a327d0ce0a9b5153d2e60dd4d16c214151ab1e63529de9aea126f0c5c3ea10b2" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.531782 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.707527 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq"] Oct 02 11:54:30 crc kubenswrapper[4658]: E1002 11:54:30.707952 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="extract-content" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.707971 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="extract-content" Oct 02 11:54:30 crc kubenswrapper[4658]: E1002 11:54:30.707984 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="registry-server" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.707992 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="registry-server" Oct 02 11:54:30 crc kubenswrapper[4658]: E1002 11:54:30.708011 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.708021 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:54:30 crc kubenswrapper[4658]: E1002 11:54:30.708035 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="extract-utilities" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.708041 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="extract-utilities" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.708239 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.708279 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a9aa43-b2c9-4339-856d-fd778252971e" containerName="registry-server" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.708928 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.713466 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.713771 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.714076 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.714477 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.714548 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.727209 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq"] Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.821010 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.821052 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.821080 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.821251 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmgv9\" (UniqueName: \"kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.821540 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.923389 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.923439 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.923467 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.923522 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmgv9\" (UniqueName: \"kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.923611 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.929335 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.929686 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.929688 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.937109 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:30 crc kubenswrapper[4658]: I1002 11:54:30.942412 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmgv9\" (UniqueName: \"kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-59fjq\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:31 crc kubenswrapper[4658]: I1002 11:54:31.047731 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:54:31 crc kubenswrapper[4658]: I1002 11:54:31.635267 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq"] Oct 02 11:54:32 crc kubenswrapper[4658]: I1002 11:54:32.568709 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" event={"ID":"074ed90b-9bda-4d7f-819d-41f3e7569ac4","Type":"ContainerStarted","Data":"3fa1485ed27171c87028dc8d10bcd4f7f7f055410867ba8fbe0c9537f0e6aa5b"} Oct 02 11:54:33 crc kubenswrapper[4658]: I1002 11:54:33.580316 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" event={"ID":"074ed90b-9bda-4d7f-819d-41f3e7569ac4","Type":"ContainerStarted","Data":"5b439a052836874521011271246ec9370fe73c72eb786a5deff3e7588c0a69e7"} Oct 02 11:54:33 crc kubenswrapper[4658]: I1002 11:54:33.613457 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" podStartSLOduration=2.748968264 podStartE2EDuration="3.613435924s" podCreationTimestamp="2025-10-02 11:54:30 +0000 UTC" firstStartedPulling="2025-10-02 11:54:31.63689015 +0000 UTC m=+2152.528043717" lastFinishedPulling="2025-10-02 11:54:32.50135777 +0000 UTC m=+2153.392511377" observedRunningTime="2025-10-02 11:54:33.60796629 +0000 UTC m=+2154.499119877" watchObservedRunningTime="2025-10-02 11:54:33.613435924 +0000 UTC m=+2154.504589491" Oct 02 11:55:27 crc kubenswrapper[4658]: I1002 11:55:27.429685 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:55:27 crc kubenswrapper[4658]: I1002 11:55:27.430472 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.924575 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.931681 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.940639 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.963479 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.963545 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:37 crc kubenswrapper[4658]: I1002 11:55:37.963645 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28pp\" (UniqueName: \"kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.065399 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.065462 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.065539 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28pp\" (UniqueName: \"kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.066434 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.066469 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.089652 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f28pp\" (UniqueName: \"kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp\") pod \"redhat-marketplace-62jgn\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.267621 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:38 crc kubenswrapper[4658]: I1002 11:55:38.803599 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:39 crc kubenswrapper[4658]: I1002 11:55:39.301822 4658 generic.go:334] "Generic (PLEG): container finished" podID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerID="d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10" exitCode=0 Oct 02 11:55:39 crc kubenswrapper[4658]: I1002 11:55:39.301904 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerDied","Data":"d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10"} Oct 02 11:55:39 crc kubenswrapper[4658]: I1002 11:55:39.303581 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerStarted","Data":"e8efa3e30d6623f33faf58310d70ec032a8979c7fe9f9693bd0f791e1c5d029c"} Oct 02 11:55:39 crc kubenswrapper[4658]: I1002 11:55:39.306629 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:55:42 crc kubenswrapper[4658]: I1002 11:55:42.340185 4658 generic.go:334] "Generic (PLEG): container finished" podID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerID="7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f" exitCode=0 Oct 02 11:55:42 crc kubenswrapper[4658]: I1002 11:55:42.340349 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerDied","Data":"7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f"} Oct 02 11:55:43 crc kubenswrapper[4658]: I1002 11:55:43.356856 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerStarted","Data":"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60"} Oct 02 11:55:43 crc kubenswrapper[4658]: I1002 11:55:43.379340 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62jgn" podStartSLOduration=2.722570454 podStartE2EDuration="6.379316544s" podCreationTimestamp="2025-10-02 11:55:37 +0000 UTC" firstStartedPulling="2025-10-02 11:55:39.306165991 +0000 UTC m=+2220.197319558" lastFinishedPulling="2025-10-02 11:55:42.962912071 +0000 UTC m=+2223.854065648" observedRunningTime="2025-10-02 11:55:43.376862475 +0000 UTC m=+2224.268016042" watchObservedRunningTime="2025-10-02 11:55:43.379316544 +0000 UTC m=+2224.270470121" Oct 02 11:55:48 crc kubenswrapper[4658]: I1002 11:55:48.268266 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:48 crc kubenswrapper[4658]: I1002 11:55:48.270040 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:48 crc kubenswrapper[4658]: I1002 11:55:48.338423 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:48 crc kubenswrapper[4658]: I1002 11:55:48.469634 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:48 crc kubenswrapper[4658]: I1002 11:55:48.590265 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:50 crc kubenswrapper[4658]: I1002 11:55:50.439417 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62jgn" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="registry-server" containerID="cri-o://9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60" gracePeriod=2 Oct 02 11:55:50 crc kubenswrapper[4658]: I1002 11:55:50.925355 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.050419 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content\") pod \"eda02377-8b53-4cc6-af5d-7c67cd991d16\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.050557 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28pp\" (UniqueName: \"kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp\") pod \"eda02377-8b53-4cc6-af5d-7c67cd991d16\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.050634 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities\") pod \"eda02377-8b53-4cc6-af5d-7c67cd991d16\" (UID: \"eda02377-8b53-4cc6-af5d-7c67cd991d16\") " Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.051776 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities" (OuterVolumeSpecName: "utilities") pod "eda02377-8b53-4cc6-af5d-7c67cd991d16" (UID: "eda02377-8b53-4cc6-af5d-7c67cd991d16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.057534 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp" (OuterVolumeSpecName: "kube-api-access-f28pp") pod "eda02377-8b53-4cc6-af5d-7c67cd991d16" (UID: "eda02377-8b53-4cc6-af5d-7c67cd991d16"). InnerVolumeSpecName "kube-api-access-f28pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.063515 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eda02377-8b53-4cc6-af5d-7c67cd991d16" (UID: "eda02377-8b53-4cc6-af5d-7c67cd991d16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.153022 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.153054 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda02377-8b53-4cc6-af5d-7c67cd991d16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.153067 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28pp\" (UniqueName: \"kubernetes.io/projected/eda02377-8b53-4cc6-af5d-7c67cd991d16-kube-api-access-f28pp\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.450816 4658 generic.go:334] "Generic (PLEG): container finished" podID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerID="9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60" exitCode=0 Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.450858 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerDied","Data":"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60"} Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.450890 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62jgn" event={"ID":"eda02377-8b53-4cc6-af5d-7c67cd991d16","Type":"ContainerDied","Data":"e8efa3e30d6623f33faf58310d70ec032a8979c7fe9f9693bd0f791e1c5d029c"} Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.450902 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62jgn" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.450911 4658 scope.go:117] "RemoveContainer" containerID="9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.476379 4658 scope.go:117] "RemoveContainer" containerID="7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.499918 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.515567 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62jgn"] Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.545355 4658 scope.go:117] "RemoveContainer" containerID="d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.570532 4658 scope.go:117] "RemoveContainer" containerID="9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60" Oct 02 11:55:51 crc kubenswrapper[4658]: E1002 11:55:51.570949 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60\": container with ID starting with 9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60 not found: ID does not exist" containerID="9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.571004 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60"} err="failed to get container status \"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60\": rpc error: code = NotFound desc = could not find container \"9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60\": container with ID starting with 9e6905d13347ecf8213b8c56b988da22053b3c4e2321d4a08503c63959093b60 not found: ID does not exist" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.571061 4658 scope.go:117] "RemoveContainer" containerID="7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f" Oct 02 11:55:51 crc kubenswrapper[4658]: E1002 11:55:51.571446 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f\": container with ID starting with 7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f not found: ID does not exist" containerID="7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.571478 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f"} err="failed to get container status \"7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f\": rpc error: code = NotFound desc = could not find container \"7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f\": container with ID starting with 7f597e2673330bc8bf233810d2ba70ac8ad4448d9b5ea579dcdb8da37d120c0f not found: ID does not exist" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.571498 4658 scope.go:117] "RemoveContainer" 
containerID="d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10" Oct 02 11:55:51 crc kubenswrapper[4658]: E1002 11:55:51.574864 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10\": container with ID starting with d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10 not found: ID does not exist" containerID="d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.574905 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10"} err="failed to get container status \"d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10\": rpc error: code = NotFound desc = could not find container \"d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10\": container with ID starting with d4ea96fbdeadcec2a483ceffad77f4ccd22d9ba2ea34f4ed09fae88cae8bad10 not found: ID does not exist" Oct 02 11:55:51 crc kubenswrapper[4658]: I1002 11:55:51.963054 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" path="/var/lib/kubelet/pods/eda02377-8b53-4cc6-af5d-7c67cd991d16/volumes" Oct 02 11:55:57 crc kubenswrapper[4658]: I1002 11:55:57.430000 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:55:57 crc kubenswrapper[4658]: I1002 11:55:57.430694 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.430104 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.430568 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.430625 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.431225 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:56:27 crc 
Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.431275 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" gracePeriod=600
Oct 02 11:56:27 crc kubenswrapper[4658]: E1002 11:56:27.619142 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.830415 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" exitCode=0
Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.830478 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"}
Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.830522 4658 scope.go:117] "RemoveContainer" containerID="7a47b4f1ee22e57466ef65cda1906555215a872b918f678a1cf99fade8b5c597"
Oct 02 11:56:27 crc kubenswrapper[4658]: I1002 11:56:27.831449 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:56:27 crc kubenswrapper[4658]: E1002 11:56:27.831801 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:56:40 crc kubenswrapper[4658]: I1002 11:56:40.949192 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:56:40 crc kubenswrapper[4658]: E1002 11:56:40.951534 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:56:52 crc kubenswrapper[4658]: I1002 11:56:52.949004 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:56:52 crc kubenswrapper[4658]: E1002 11:56:52.949791 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:57:06 crc kubenswrapper[4658]: I1002 11:57:06.949152 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:57:06 crc kubenswrapper[4658]: E1002 11:57:06.950147 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:57:20 crc kubenswrapper[4658]: I1002 11:57:20.949793 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:57:20 crc kubenswrapper[4658]: E1002 11:57:20.950880 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:57:33 crc kubenswrapper[4658]: I1002 11:57:33.949473 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:57:33 crc kubenswrapper[4658]: E1002 11:57:33.950275 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:57:45 crc kubenswrapper[4658]: I1002 11:57:45.950096 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:57:45 crc kubenswrapper[4658]: E1002 11:57:45.951018 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 11:58:00 crc kubenswrapper[4658]: I1002 11:58:00.950214 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 11:58:00 crc kubenswrapper[4658]: E1002 11:58:00.951404 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:58:12 crc kubenswrapper[4658]: E1002 11:58:12.950918 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:58:25 crc kubenswrapper[4658]: I1002 11:58:25.951289 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:58:25 crc kubenswrapper[4658]: E1002 11:58:25.952368 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:58:40 crc kubenswrapper[4658]: I1002 11:58:40.948735 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:58:40 crc kubenswrapper[4658]: E1002 11:58:40.949570 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:58:52 crc kubenswrapper[4658]: I1002 11:58:52.387680 4658 generic.go:334] "Generic (PLEG): container finished" podID="074ed90b-9bda-4d7f-819d-41f3e7569ac4" containerID="5b439a052836874521011271246ec9370fe73c72eb786a5deff3e7588c0a69e7" exitCode=0 Oct 02 11:58:52 crc kubenswrapper[4658]: I1002 11:58:52.387768 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" event={"ID":"074ed90b-9bda-4d7f-819d-41f3e7569ac4","Type":"ContainerDied","Data":"5b439a052836874521011271246ec9370fe73c72eb786a5deff3e7588c0a69e7"} Oct 02 11:58:53 crc kubenswrapper[4658]: I1002 11:58:53.934370 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.025947 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmgv9\" (UniqueName: \"kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9\") pod \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.026067 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle\") pod \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.026106 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory\") pod \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.026174 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key\") pod \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.026286 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0\") pod \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\" (UID: \"074ed90b-9bda-4d7f-819d-41f3e7569ac4\") " Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.033077 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "074ed90b-9bda-4d7f-819d-41f3e7569ac4" (UID: "074ed90b-9bda-4d7f-819d-41f3e7569ac4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.042795 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9" (OuterVolumeSpecName: "kube-api-access-cmgv9") pod "074ed90b-9bda-4d7f-819d-41f3e7569ac4" (UID: "074ed90b-9bda-4d7f-819d-41f3e7569ac4"). InnerVolumeSpecName "kube-api-access-cmgv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.054243 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "074ed90b-9bda-4d7f-819d-41f3e7569ac4" (UID: "074ed90b-9bda-4d7f-819d-41f3e7569ac4"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.057116 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "074ed90b-9bda-4d7f-819d-41f3e7569ac4" (UID: "074ed90b-9bda-4d7f-819d-41f3e7569ac4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.067276 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory" (OuterVolumeSpecName: "inventory") pod "074ed90b-9bda-4d7f-819d-41f3e7569ac4" (UID: "074ed90b-9bda-4d7f-819d-41f3e7569ac4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.131779 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmgv9\" (UniqueName: \"kubernetes.io/projected/074ed90b-9bda-4d7f-819d-41f3e7569ac4-kube-api-access-cmgv9\") on node \"crc\" DevicePath \"\"" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.131836 4658 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.131857 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.131877 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.131898 4658 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/074ed90b-9bda-4d7f-819d-41f3e7569ac4-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.414815 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" event={"ID":"074ed90b-9bda-4d7f-819d-41f3e7569ac4","Type":"ContainerDied","Data":"3fa1485ed27171c87028dc8d10bcd4f7f7f055410867ba8fbe0c9537f0e6aa5b"} Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.414862 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fa1485ed27171c87028dc8d10bcd4f7f7f055410867ba8fbe0c9537f0e6aa5b" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.414861 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-59fjq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.530506 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq"] Oct 02 11:58:54 crc kubenswrapper[4658]: E1002 11:58:54.531030 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="extract-content" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531054 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="extract-content" Oct 02 11:58:54 crc kubenswrapper[4658]: E1002 11:58:54.531073 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="registry-server" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531083 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="registry-server" Oct 02 11:58:54 crc kubenswrapper[4658]: E1002 11:58:54.531105 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="extract-utilities" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531114 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="extract-utilities" Oct 02 11:58:54 crc kubenswrapper[4658]: E1002 11:58:54.531135 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074ed90b-9bda-4d7f-819d-41f3e7569ac4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531145 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="074ed90b-9bda-4d7f-819d-41f3e7569ac4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531442 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda02377-8b53-4cc6-af5d-7c67cd991d16" containerName="registry-server" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.531471 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="074ed90b-9bda-4d7f-819d-41f3e7569ac4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.532396 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.535212 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.535919 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.535929 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.535974 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.537324 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.552946 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.553249 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.573899 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq"] Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.642694 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.642764 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.642874 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.642912 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.642946 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.643020 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnljd\" (UniqueName: \"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.643220 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.643368 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.643439 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745657 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745734 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745864 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745903 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745949 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.745976 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.746005 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.746024 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnljd\" (UniqueName: \"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.747101 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.750648 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.750904 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.751008 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.751276 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.751901 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.753416 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.755421 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.765064 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnljd\" (UniqueName: \"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qg2dq\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.858476 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 11:58:54 crc kubenswrapper[4658]: I1002 11:58:54.952767 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:58:54 crc kubenswrapper[4658]: E1002 11:58:54.953099 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:58:55 crc kubenswrapper[4658]: I1002 11:58:55.179169 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq"] Oct 02 11:58:55 crc kubenswrapper[4658]: I1002 11:58:55.425829 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" event={"ID":"4d537487-cd7a-43bd-ba29-fc9df6af7913","Type":"ContainerStarted","Data":"581db6d1fd8875788ea9f037d70702e5b6fa341953d2ae14f3bf9f3b5f0aeb95"} Oct 02 11:58:56 crc kubenswrapper[4658]: I1002 11:58:56.437107 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" event={"ID":"4d537487-cd7a-43bd-ba29-fc9df6af7913","Type":"ContainerStarted","Data":"76a51820c47f1dfd622b3d67de72160554d7c4a221d3461cfff49edfb8bb036e"} Oct 02 11:58:56 crc kubenswrapper[4658]: I1002 11:58:56.467554 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" podStartSLOduration=2.055215961 podStartE2EDuration="2.467520234s" podCreationTimestamp="2025-10-02 11:58:54 +0000 UTC" firstStartedPulling="2025-10-02 11:58:55.183759298 +0000 UTC m=+2416.074912865" lastFinishedPulling="2025-10-02 11:58:55.596063571 +0000 UTC m=+2416.487217138" observedRunningTime="2025-10-02 11:58:56.455715853 +0000 UTC m=+2417.346869460" watchObservedRunningTime="2025-10-02 11:58:56.467520234 +0000 UTC m=+2417.358673841" Oct 02 11:59:06 crc kubenswrapper[4658]: I1002 11:59:06.950130 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:59:06 crc kubenswrapper[4658]: E1002 11:59:06.951599 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:59:19 crc kubenswrapper[4658]: I1002 11:59:19.955149 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:59:19 crc kubenswrapper[4658]: E1002 11:59:19.956515 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" 
podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:59:31 crc kubenswrapper[4658]: I1002 11:59:31.949658 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:59:31 crc kubenswrapper[4658]: E1002 11:59:31.951242 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:59:46 crc kubenswrapper[4658]: I1002 11:59:46.949480 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:59:46 crc kubenswrapper[4658]: E1002 11:59:46.950188 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 11:59:57 crc kubenswrapper[4658]: I1002 11:59:57.949856 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 11:59:57 crc kubenswrapper[4658]: E1002 11:59:57.950675 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.178801 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58"] Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.185438 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.192355 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlz9w\" (UniqueName: \"kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.192757 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.192829 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.196321 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58"] Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.200943 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.201664 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.295262 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.295329 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.295379 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlz9w\" (UniqueName: \"kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.296363 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume\") pod 
\"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.302537 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.312187 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlz9w\" (UniqueName: \"kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w\") pod \"collect-profiles-29323440-b6k58\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:00 crc kubenswrapper[4658]: I1002 12:00:00.528163 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:01 crc kubenswrapper[4658]: I1002 12:00:01.030546 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58"] Oct 02 12:00:01 crc kubenswrapper[4658]: I1002 12:00:01.186055 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" event={"ID":"d7e476dd-9cf3-4c98-8005-3e822bfc1053","Type":"ContainerStarted","Data":"c42eed511f93d8a23aeeda7d6b746349a51282455e889df85fd338f6f427ff07"} Oct 02 12:00:02 crc kubenswrapper[4658]: I1002 12:00:02.199736 4658 generic.go:334] "Generic (PLEG): container finished" podID="d7e476dd-9cf3-4c98-8005-3e822bfc1053" containerID="e125a5c53ca68207489e4af241f5b192d51ffe20153b53733ab606af0810f7bd" exitCode=0 Oct 02 12:00:02 crc kubenswrapper[4658]: I1002 12:00:02.199841 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" event={"ID":"d7e476dd-9cf3-4c98-8005-3e822bfc1053","Type":"ContainerDied","Data":"e125a5c53ca68207489e4af241f5b192d51ffe20153b53733ab606af0810f7bd"} Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.533288 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.666288 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume\") pod \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.666528 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlz9w\" (UniqueName: \"kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w\") pod \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.666723 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume\") pod \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\" (UID: \"d7e476dd-9cf3-4c98-8005-3e822bfc1053\") " Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.667938 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7e476dd-9cf3-4c98-8005-3e822bfc1053" (UID: "d7e476dd-9cf3-4c98-8005-3e822bfc1053"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.673704 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7e476dd-9cf3-4c98-8005-3e822bfc1053" (UID: "d7e476dd-9cf3-4c98-8005-3e822bfc1053"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.685579 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w" (OuterVolumeSpecName: "kube-api-access-dlz9w") pod "d7e476dd-9cf3-4c98-8005-3e822bfc1053" (UID: "d7e476dd-9cf3-4c98-8005-3e822bfc1053"). InnerVolumeSpecName "kube-api-access-dlz9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.777502 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlz9w\" (UniqueName: \"kubernetes.io/projected/d7e476dd-9cf3-4c98-8005-3e822bfc1053-kube-api-access-dlz9w\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.777936 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7e476dd-9cf3-4c98-8005-3e822bfc1053-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:03 crc kubenswrapper[4658]: I1002 12:00:03.778137 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7e476dd-9cf3-4c98-8005-3e822bfc1053-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:04 crc kubenswrapper[4658]: I1002 12:00:04.224106 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" event={"ID":"d7e476dd-9cf3-4c98-8005-3e822bfc1053","Type":"ContainerDied","Data":"c42eed511f93d8a23aeeda7d6b746349a51282455e889df85fd338f6f427ff07"} Oct 02 12:00:04 crc kubenswrapper[4658]: I1002 12:00:04.224172 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58" Oct 02 12:00:04 crc kubenswrapper[4658]: I1002 12:00:04.224181 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42eed511f93d8a23aeeda7d6b746349a51282455e889df85fd338f6f427ff07" Oct 02 12:00:04 crc kubenswrapper[4658]: I1002 12:00:04.621271 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z"] Oct 02 12:00:04 crc kubenswrapper[4658]: I1002 12:00:04.635391 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-h867z"] Oct 02 12:00:05 crc kubenswrapper[4658]: I1002 12:00:05.970611 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d11297-76f5-4bdd-a744-57ad6376de77" path="/var/lib/kubelet/pods/c7d11297-76f5-4bdd-a744-57ad6376de77/volumes" Oct 02 12:00:09 crc kubenswrapper[4658]: I1002 12:00:09.957535 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:00:09 crc kubenswrapper[4658]: E1002 12:00:09.958627 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:00:23 crc kubenswrapper[4658]: I1002 12:00:23.675121 4658 scope.go:117] "RemoveContainer" containerID="8c6b132cbfb3f52af756472b2c7a2c4f91cce0dfad98bce88f0ef9ff31440d9b" Oct 02 12:00:23 crc kubenswrapper[4658]: I1002 12:00:23.704701 4658 scope.go:117] "RemoveContainer" containerID="bf436456450b87d3ad415542f4e510a51ca4874bb528d2bbe9b5c4d595208f77" Oct 02 12:00:23 crc kubenswrapper[4658]: I1002 12:00:23.783200 4658 scope.go:117] "RemoveContainer" containerID="e5ba68c0abb79fafa459abd278582b539aeb645727f24119389cb207a4c149cd" Oct 02 12:00:23 crc kubenswrapper[4658]: I1002 
12:00:23.833162 4658 scope.go:117] "RemoveContainer" containerID="9a7226b06c4f055f9ab622a3fd03cce6126bdcda9b36abf785a5fd45e055e593" Oct 02 12:00:23 crc kubenswrapper[4658]: I1002 12:00:23.948852 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:00:23 crc kubenswrapper[4658]: E1002 12:00:23.949232 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:00:38 crc kubenswrapper[4658]: I1002 12:00:38.949288 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:00:38 crc kubenswrapper[4658]: E1002 12:00:38.950112 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:00:49 crc kubenswrapper[4658]: I1002 12:00:49.984378 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:00:49 crc kubenswrapper[4658]: E1002 12:00:49.985735 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.168882 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323441-htvr7"] Oct 02 12:01:00 crc kubenswrapper[4658]: E1002 12:01:00.170054 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e476dd-9cf3-4c98-8005-3e822bfc1053" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.170073 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e476dd-9cf3-4c98-8005-3e822bfc1053" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.170369 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e476dd-9cf3-4c98-8005-3e822bfc1053" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.171223 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.180602 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323441-htvr7"] Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.205118 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.205246 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.205462 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.205668 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k77l\" (UniqueName: \"kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.307615 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.307684 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k77l\" (UniqueName: \"kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.307775 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.307821 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.314462 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.314521 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.316036 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.332725 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k77l\" (UniqueName: \"kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l\") pod \"keystone-cron-29323441-htvr7\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.492251 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:00 crc kubenswrapper[4658]: I1002 12:01:00.952964 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323441-htvr7"] Oct 02 12:01:01 crc kubenswrapper[4658]: I1002 12:01:01.831859 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-htvr7" event={"ID":"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276","Type":"ContainerStarted","Data":"6cbc77c2a115252161585a8fb5e2c7f14931d797847da109fa36fc1b89981a9f"} Oct 02 12:01:01 crc kubenswrapper[4658]: I1002 12:01:01.832232 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-htvr7" event={"ID":"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276","Type":"ContainerStarted","Data":"2928d10b25630e36e9f5b801086f04bdf33cf9c2345d924f245a36375fdc5e7c"} Oct 02 12:01:01 crc kubenswrapper[4658]: I1002 12:01:01.858518 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323441-htvr7" podStartSLOduration=1.858496339 podStartE2EDuration="1.858496339s" podCreationTimestamp="2025-10-02 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:01:01.85106455 +0000 UTC m=+2542.742218127" watchObservedRunningTime="2025-10-02 12:01:01.858496339 +0000 UTC m=+2542.749649896" Oct 02 12:01:02 crc kubenswrapper[4658]: I1002 12:01:02.950251 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:01:02 crc kubenswrapper[4658]: E1002 12:01:02.951004 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:01:03 crc kubenswrapper[4658]: I1002 12:01:03.850712 4658 generic.go:334] "Generic (PLEG): container finished" podID="a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" containerID="6cbc77c2a115252161585a8fb5e2c7f14931d797847da109fa36fc1b89981a9f" exitCode=0 Oct 02 12:01:03 crc kubenswrapper[4658]: I1002 12:01:03.850779 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-htvr7" event={"ID":"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276","Type":"ContainerDied","Data":"6cbc77c2a115252161585a8fb5e2c7f14931d797847da109fa36fc1b89981a9f"} Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.227673 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.342209 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle\") pod \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.342338 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data\") pod \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.342442 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k77l\" (UniqueName: \"kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l\") pod \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.342469 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys\") pod \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\" (UID: \"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276\") " Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.347381 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" (UID: "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.349256 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l" (OuterVolumeSpecName: "kube-api-access-5k77l") pod "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" (UID: "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276"). InnerVolumeSpecName "kube-api-access-5k77l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.386815 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" (UID: "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.429812 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data" (OuterVolumeSpecName: "config-data") pod "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" (UID: "a39e700e-3d2a-4deb-8ab5-ad53c0cf8276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.444465 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.444501 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k77l\" (UniqueName: \"kubernetes.io/projected/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-kube-api-access-5k77l\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.444514 4658 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.444522 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39e700e-3d2a-4deb-8ab5-ad53c0cf8276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.878263 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-htvr7" event={"ID":"a39e700e-3d2a-4deb-8ab5-ad53c0cf8276","Type":"ContainerDied","Data":"2928d10b25630e36e9f5b801086f04bdf33cf9c2345d924f245a36375fdc5e7c"} Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.878388 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2928d10b25630e36e9f5b801086f04bdf33cf9c2345d924f245a36375fdc5e7c" Oct 02 12:01:05 crc kubenswrapper[4658]: I1002 12:01:05.878332 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323441-htvr7" Oct 02 12:01:13 crc kubenswrapper[4658]: I1002 12:01:13.949821 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:01:13 crc kubenswrapper[4658]: E1002 12:01:13.950727 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:01:28 crc kubenswrapper[4658]: I1002 12:01:28.948885 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97" Oct 02 12:01:30 crc kubenswrapper[4658]: I1002 12:01:30.143868 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f"} Oct 02 12:02:32 crc kubenswrapper[4658]: I1002 12:02:32.882452 4658 generic.go:334] "Generic (PLEG): container finished" podID="4d537487-cd7a-43bd-ba29-fc9df6af7913" containerID="76a51820c47f1dfd622b3d67de72160554d7c4a221d3461cfff49edfb8bb036e" exitCode=0 Oct 02 12:02:32 crc kubenswrapper[4658]: I1002 12:02:32.882532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" event={"ID":"4d537487-cd7a-43bd-ba29-fc9df6af7913","Type":"ContainerDied","Data":"76a51820c47f1dfd622b3d67de72160554d7c4a221d3461cfff49edfb8bb036e"} Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.374414 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507351 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507443 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507559 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507616 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507642 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507704 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnljd\" (UniqueName: \"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507733 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507763 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.507783 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0\") pod \"4d537487-cd7a-43bd-ba29-fc9df6af7913\" (UID: \"4d537487-cd7a-43bd-ba29-fc9df6af7913\") " Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.514898 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd" (OuterVolumeSpecName: "kube-api-access-lnljd") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "kube-api-access-lnljd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.515008 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.538560 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.543242 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.548157 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.548200 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.557140 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory" (OuterVolumeSpecName: "inventory") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.565719 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.566925 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4d537487-cd7a-43bd-ba29-fc9df6af7913" (UID: "4d537487-cd7a-43bd-ba29-fc9df6af7913"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610346 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610396 4658 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610413 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610431 4658 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610446 4658 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610458 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnljd\" (UniqueName: \"kubernetes.io/projected/4d537487-cd7a-43bd-ba29-fc9df6af7913-kube-api-access-lnljd\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610470 4658 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610480 4658 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.610489 4658 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d537487-cd7a-43bd-ba29-fc9df6af7913-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.902551 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq" event={"ID":"4d537487-cd7a-43bd-ba29-fc9df6af7913","Type":"ContainerDied","Data":"581db6d1fd8875788ea9f037d70702e5b6fa341953d2ae14f3bf9f3b5f0aeb95"} Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.902595 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qg2dq"
Oct 02 12:02:34 crc kubenswrapper[4658]: I1002 12:02:34.902609 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581db6d1fd8875788ea9f037d70702e5b6fa341953d2ae14f3bf9f3b5f0aeb95"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.035628 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"]
Oct 02 12:02:35 crc kubenswrapper[4658]: E1002 12:02:35.036596 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d537487-cd7a-43bd-ba29-fc9df6af7913" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.036733 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d537487-cd7a-43bd-ba29-fc9df6af7913" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 02 12:02:35 crc kubenswrapper[4658]: E1002 12:02:35.036847 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" containerName="keystone-cron"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.036942 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" containerName="keystone-cron"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.037371 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39e700e-3d2a-4deb-8ab5-ad53c0cf8276" containerName="keystone-cron"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.037500 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d537487-cd7a-43bd-ba29-fc9df6af7913" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.038600 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.041618 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.046248 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.046257 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wxbtn"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.046382 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.047901 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"]
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.049540 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.223681 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5wv\" (UniqueName: \"kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.223731 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.223936 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.223993 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.224047 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.224099 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.224173 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.326921 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5wv\" (UniqueName: \"kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.327453 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.327751 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.327946 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.328124 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.328353 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.328586 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.331872 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.331916 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.331928 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.332491 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.332496 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.341948 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.349909 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5wv\" (UniqueName: \"kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.408368 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.987678 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"]
Oct 02 12:02:35 crc kubenswrapper[4658]: I1002 12:02:35.992071 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 12:02:36 crc kubenswrapper[4658]: I1002 12:02:36.922983 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp" event={"ID":"7d923299-fe7c-4ece-8f48-7c95a141f4c8","Type":"ContainerStarted","Data":"890bc24883da544802771f6fff1d6cc4c42658a4098ebd06480c432ce2c0356f"}
Oct 02 12:02:36 crc kubenswrapper[4658]: I1002 12:02:36.923311 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp" event={"ID":"7d923299-fe7c-4ece-8f48-7c95a141f4c8","Type":"ContainerStarted","Data":"53ab9835d4af1d429847534d233af85cc64a60e29ab6d795c9240eb19aaf40d9"}
Oct 02 12:02:36 crc kubenswrapper[4658]: I1002 12:02:36.944718 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp" podStartSLOduration=1.390377651 podStartE2EDuration="1.944696255s" podCreationTimestamp="2025-10-02 12:02:35 +0000 UTC" firstStartedPulling="2025-10-02 12:02:35.991792715 +0000 UTC m=+2636.882946282" lastFinishedPulling="2025-10-02 12:02:36.546111319 +0000 UTC m=+2637.437264886" observedRunningTime="2025-10-02 12:02:36.939748186 +0000 UTC m=+2637.830901783" watchObservedRunningTime="2025-10-02 12:02:36.944696255 +0000 UTC m=+2637.835849822"
Oct 02 12:03:57 crc kubenswrapper[4658]: I1002 12:03:57.429762 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:03:57 crc kubenswrapper[4658]: I1002 12:03:57.430500 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:04:27 crc kubenswrapper[4658]: I1002 12:04:27.430715 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:04:27 crc kubenswrapper[4658]: I1002 12:04:27.431453 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.547372 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.551941 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.591108 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qwdq\" (UniqueName: \"kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.591164 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.591341 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.598691 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.692836 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qwdq\" (UniqueName: \"kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.692877 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.692943 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.693531 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.693576 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.718368 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qwdq\" (UniqueName: \"kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq\") pod \"community-operators-dz5lk\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") " pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:50 crc kubenswrapper[4658]: I1002 12:04:50.909716 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:04:51 crc kubenswrapper[4658]: I1002 12:04:51.448169 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:04:52 crc kubenswrapper[4658]: I1002 12:04:52.414178 4658 generic.go:334] "Generic (PLEG): container finished" podID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerID="5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2" exitCode=0
Oct 02 12:04:52 crc kubenswrapper[4658]: I1002 12:04:52.414231 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerDied","Data":"5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2"}
Oct 02 12:04:52 crc kubenswrapper[4658]: I1002 12:04:52.414592 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerStarted","Data":"320ced30e31e59fee84bf288485734cf6d01da178b453c0f9691ae9636b93305"}
Oct 02 12:04:53 crc kubenswrapper[4658]: I1002 12:04:53.428813 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerStarted","Data":"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"}
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.327203 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.330070 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.343239 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.373451 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.373664 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4sv\" (UniqueName: \"kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.374098 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.445713 4658 generic.go:334] "Generic (PLEG): container finished" podID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerID="357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a" exitCode=0
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.445794 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerDied","Data":"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"}
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.477341 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.477486 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4sv\" (UniqueName: \"kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.477553 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.478100 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.478682 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.505257 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4sv\" (UniqueName: \"kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv\") pod \"certified-operators-xpcfv\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") " pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:54 crc kubenswrapper[4658]: I1002 12:04:54.650921 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:04:55 crc kubenswrapper[4658]: I1002 12:04:55.185023 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:04:55 crc kubenswrapper[4658]: I1002 12:04:55.455135 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerStarted","Data":"959f78ae2121a0bac5adac7d6697a1575df59708ada2fd05f7230c9a05726459"}
Oct 02 12:04:56 crc kubenswrapper[4658]: I1002 12:04:56.472192 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerStarted","Data":"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"}
Oct 02 12:04:56 crc kubenswrapper[4658]: I1002 12:04:56.474348 4658 generic.go:334] "Generic (PLEG): container finished" podID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerID="0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780" exitCode=0
Oct 02 12:04:56 crc kubenswrapper[4658]: I1002 12:04:56.474415 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerDied","Data":"0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780"}
Oct 02 12:04:56 crc kubenswrapper[4658]: I1002 12:04:56.502823 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dz5lk" podStartSLOduration=3.755231761 podStartE2EDuration="6.502802363s" podCreationTimestamp="2025-10-02 12:04:50 +0000 UTC" firstStartedPulling="2025-10-02 12:04:52.418140406 +0000 UTC m=+2773.309293963" lastFinishedPulling="2025-10-02 12:04:55.165710998 +0000 UTC m=+2776.056864565" observedRunningTime="2025-10-02 12:04:56.49649836 +0000 UTC m=+2777.387651927" watchObservedRunningTime="2025-10-02 12:04:56.502802363 +0000 UTC m=+2777.393955930"
Oct 02 12:04:57 crc kubenswrapper[4658]: I1002 12:04:57.430137 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:04:57 crc kubenswrapper[4658]: I1002 12:04:57.430215 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:04:57 crc kubenswrapper[4658]: I1002 12:04:57.430268 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5"
Oct 02 12:04:57 crc kubenswrapper[4658]: I1002 12:04:57.431166 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:04:57 crc kubenswrapper[4658]: I1002 12:04:57.431243 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f" gracePeriod=600
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.496648 4658 generic.go:334] "Generic (PLEG): container finished" podID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerID="e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468" exitCode=0
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.497888 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerDied","Data":"e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468"}
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.505140 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f" exitCode=0
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.505229 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f"}
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.505269 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"}
Oct 02 12:04:58 crc kubenswrapper[4658]: I1002 12:04:58.505332 4658 scope.go:117] "RemoveContainer" containerID="fa9d2f0c2de315fb4b7edd47644905f8c1a0eeeb28a4acc20153c9f34a43cb97"
Oct 02 12:04:59 crc kubenswrapper[4658]: I1002 12:04:59.521791 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerStarted","Data":"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"}
Oct 02 12:04:59 crc kubenswrapper[4658]: I1002 12:04:59.558924 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xpcfv" podStartSLOduration=2.898762406 podStartE2EDuration="5.558888842s" podCreationTimestamp="2025-10-02 12:04:54 +0000 UTC" firstStartedPulling="2025-10-02 12:04:56.477539999 +0000 UTC m=+2777.368693566" lastFinishedPulling="2025-10-02 12:04:59.137666435 +0000 UTC m=+2780.028820002" observedRunningTime="2025-10-02 12:04:59.548329822 +0000 UTC m=+2780.439483409" watchObservedRunningTime="2025-10-02 12:04:59.558888842 +0000 UTC m=+2780.450042419"
Oct 02 12:05:00 crc kubenswrapper[4658]: I1002 12:05:00.910269 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:00 crc kubenswrapper[4658]: I1002 12:05:00.910932 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:00 crc kubenswrapper[4658]: I1002 12:05:00.961575 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:01 crc kubenswrapper[4658]: I1002 12:05:01.596781 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:02 crc kubenswrapper[4658]: I1002 12:05:02.114229 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:05:03 crc kubenswrapper[4658]: I1002 12:05:03.563261 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dz5lk" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="registry-server" containerID="cri-o://6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a" gracePeriod=2
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.036427 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.074501 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities\") pod \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") "
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.074635 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qwdq\" (UniqueName: \"kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq\") pod \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") "
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.074685 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content\") pod \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\" (UID: \"28289dbf-4231-41cd-98d2-9f0046e7fdf8\") "
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.075705 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities" (OuterVolumeSpecName: "utilities") pod "28289dbf-4231-41cd-98d2-9f0046e7fdf8" (UID: "28289dbf-4231-41cd-98d2-9f0046e7fdf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.081113 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq" (OuterVolumeSpecName: "kube-api-access-2qwdq") pod "28289dbf-4231-41cd-98d2-9f0046e7fdf8" (UID: "28289dbf-4231-41cd-98d2-9f0046e7fdf8"). InnerVolumeSpecName "kube-api-access-2qwdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.082834 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.083042 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qwdq\" (UniqueName: \"kubernetes.io/projected/28289dbf-4231-41cd-98d2-9f0046e7fdf8-kube-api-access-2qwdq\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.123840 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28289dbf-4231-41cd-98d2-9f0046e7fdf8" (UID: "28289dbf-4231-41cd-98d2-9f0046e7fdf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.185447 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28289dbf-4231-41cd-98d2-9f0046e7fdf8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.573972 4658 generic.go:334] "Generic (PLEG): container finished" podID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerID="6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a" exitCode=0
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.574030 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerDied","Data":"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"}
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.574064 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dz5lk" event={"ID":"28289dbf-4231-41cd-98d2-9f0046e7fdf8","Type":"ContainerDied","Data":"320ced30e31e59fee84bf288485734cf6d01da178b453c0f9691ae9636b93305"}
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.574086 4658 scope.go:117] "RemoveContainer" containerID="6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.574282 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dz5lk"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.597851 4658 scope.go:117] "RemoveContainer" containerID="357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.620676 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.630910 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dz5lk"]
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.636965 4658 scope.go:117] "RemoveContainer" containerID="5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.651153 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.651201 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.690528 4658 scope.go:117] "RemoveContainer" containerID="6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"
Oct 02 12:05:04 crc kubenswrapper[4658]: E1002 12:05:04.690965 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a\": container with ID starting with 6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a not found: ID does not exist" containerID="6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.691009 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a"} err="failed to get container status \"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a\": rpc error: code = NotFound desc = could not find container \"6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a\": container with ID starting with 6c5f5fa90ef2e3390c74fad94fc9aaa423a2ae5421bc0e1cb713f8f65b07385a not found: ID does not exist"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.691040 4658 scope.go:117] "RemoveContainer" containerID="357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"
Oct 02 12:05:04 crc kubenswrapper[4658]: E1002 12:05:04.691795 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a\": container with ID starting with 357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a not found: ID does not exist" containerID="357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.691855 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a"} err="failed to get container status \"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a\": rpc error: code = NotFound desc = could not find container \"357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a\": container with ID starting with 357dbd711ba88e592d2288a3d04cb885615e3838faa21cf5427376328a36549a not found: ID does not exist"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.691897 4658 scope.go:117] "RemoveContainer" containerID="5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2"
Oct 02 12:05:04 crc kubenswrapper[4658]: E1002 12:05:04.697690 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2\": container with ID starting with 5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2 not found: ID does not exist" containerID="5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.697771 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2"} err="failed to get container status \"5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2\": rpc error: code = NotFound desc = could not find container \"5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2\": container with ID starting with 5e6dddd5528d051ac64717f4d57033a1ac1b373a54412569c4afe95f2a67fcb2 not found: ID does not exist"
Oct 02 12:05:04 crc kubenswrapper[4658]: I1002 12:05:04.709251 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:05 crc kubenswrapper[4658]: I1002 12:05:05.655887 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:05 crc kubenswrapper[4658]: I1002 12:05:05.963456 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" path="/var/lib/kubelet/pods/28289dbf-4231-41cd-98d2-9f0046e7fdf8/volumes"
Oct 02 12:05:07 crc kubenswrapper[4658]: I1002 12:05:07.120847 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:05:07 crc kubenswrapper[4658]: I1002 12:05:07.615250 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xpcfv" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="registry-server" containerID="cri-o://ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d" gracePeriod=2
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.612363 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.625104 4658 generic.go:334] "Generic (PLEG): container finished" podID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerID="ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d" exitCode=0
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.625142 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerDied","Data":"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"}
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.625168 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcfv" event={"ID":"26c3369a-6e86-4acc-a67d-46254c99fe19","Type":"ContainerDied","Data":"959f78ae2121a0bac5adac7d6697a1575df59708ada2fd05f7230c9a05726459"}
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.625173 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcfv"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.625185 4658 scope.go:117] "RemoveContainer" containerID="ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.655335 4658 scope.go:117] "RemoveContainer" containerID="e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.681724 4658 scope.go:117] "RemoveContainer" containerID="0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.742507 4658 scope.go:117] "RemoveContainer" containerID="ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"
Oct 02 12:05:08 crc kubenswrapper[4658]: E1002 12:05:08.743058 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d\": container with ID starting with ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d not found: ID does not exist" containerID="ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.743110 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d"} err="failed to get container status \"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d\": rpc error: code = NotFound desc = could not find container \"ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d\": container with ID starting with ef5fcf534f2a838ee71078ab3b0b9c5cd63dd04c935925d813c0aae18313631d not found: ID does not exist"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.743140 4658 scope.go:117] "RemoveContainer" containerID="e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468"
Oct 02 12:05:08 crc kubenswrapper[4658]: E1002 12:05:08.743726 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468\": container with ID starting with e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468 not found: ID does not exist" containerID="e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.743782 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468"} err="failed to get container status \"e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468\": rpc error: code = NotFound desc = could not find container \"e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468\": container with ID starting with e2a98174dca0271fecca8b7d6abf72a5b4406c1081042c0c420e97c32b425468 not found: ID does not exist"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.743819 4658 scope.go:117] "RemoveContainer" containerID="0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780"
Oct 02 12:05:08 crc kubenswrapper[4658]: E1002 12:05:08.744139 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780\": container with ID starting with 0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780 not found: ID does not exist" containerID="0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.744179 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780"} err="failed to get container status \"0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780\": rpc error: code = NotFound desc = could not find container \"0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780\": container with ID starting with 0417799163cfd4e39164db5011ac7bc5a309dd1b2482278841e2932e9293a780 not found: ID does not exist"
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.791731 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content\") pod \"26c3369a-6e86-4acc-a67d-46254c99fe19\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") "
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.792007 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh4sv\" (UniqueName: \"kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv\") pod \"26c3369a-6e86-4acc-a67d-46254c99fe19\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") "
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.792064 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities\") pod \"26c3369a-6e86-4acc-a67d-46254c99fe19\" (UID: \"26c3369a-6e86-4acc-a67d-46254c99fe19\") "
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.793153 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities" (OuterVolumeSpecName: "utilities") pod "26c3369a-6e86-4acc-a67d-46254c99fe19" (UID: "26c3369a-6e86-4acc-a67d-46254c99fe19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.797787 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv" (OuterVolumeSpecName: "kube-api-access-zh4sv") pod "26c3369a-6e86-4acc-a67d-46254c99fe19" (UID: "26c3369a-6e86-4acc-a67d-46254c99fe19"). InnerVolumeSpecName "kube-api-access-zh4sv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.848671 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26c3369a-6e86-4acc-a67d-46254c99fe19" (UID: "26c3369a-6e86-4acc-a67d-46254c99fe19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.895494 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.895550 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh4sv\" (UniqueName: \"kubernetes.io/projected/26c3369a-6e86-4acc-a67d-46254c99fe19-kube-api-access-zh4sv\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.895572 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c3369a-6e86-4acc-a67d-46254c99fe19-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.971612 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:05:08 crc kubenswrapper[4658]: I1002 12:05:08.987465 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xpcfv"]
Oct 02 12:05:09 crc kubenswrapper[4658]: I1002 12:05:09.963940 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" path="/var/lib/kubelet/pods/26c3369a-6e86-4acc-a67d-46254c99fe19/volumes"
Oct 02 12:05:17 crc kubenswrapper[4658]: I1002 12:05:17.714162 4658 generic.go:334] "Generic (PLEG): container finished" podID="7d923299-fe7c-4ece-8f48-7c95a141f4c8" containerID="890bc24883da544802771f6fff1d6cc4c42658a4098ebd06480c432ce2c0356f" exitCode=0
Oct 02 12:05:17 crc kubenswrapper[4658]: I1002 12:05:17.714362 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp" event={"ID":"7d923299-fe7c-4ece-8f48-7c95a141f4c8","Type":"ContainerDied","Data":"890bc24883da544802771f6fff1d6cc4c42658a4098ebd06480c432ce2c0356f"}
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.226863 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420048 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x5wv\" (UniqueName: \"kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420100 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420182 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420248 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420279 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420297 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.420370 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory\") pod \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\" (UID: \"7d923299-fe7c-4ece-8f48-7c95a141f4c8\") "
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.427726 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv" (OuterVolumeSpecName: "kube-api-access-5x5wv") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "kube-api-access-5x5wv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.429703 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.455228 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.456568 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.457414 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.468169 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory" (OuterVolumeSpecName: "inventory") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.478433 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d923299-fe7c-4ece-8f48-7c95a141f4c8" (UID: "7d923299-fe7c-4ece-8f48-7c95a141f4c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523038 4658 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523083 4658 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523097 4658 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523111 4658 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523124 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x5wv\" (UniqueName: \"kubernetes.io/projected/7d923299-fe7c-4ece-8f48-7c95a141f4c8-kube-api-access-5x5wv\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523134 4658 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.523145 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d923299-fe7c-4ece-8f48-7c95a141f4c8-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.742246 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp" event={"ID":"7d923299-fe7c-4ece-8f48-7c95a141f4c8","Type":"ContainerDied","Data":"53ab9835d4af1d429847534d233af85cc64a60e29ab6d795c9240eb19aaf40d9"}
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.742324 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ab9835d4af1d429847534d233af85cc64a60e29ab6d795c9240eb19aaf40d9"
Oct 02 12:05:19 crc kubenswrapper[4658]: I1002 12:05:19.742330 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp"
Oct 02 12:05:59 crc kubenswrapper[4658]: I1002 12:05:59.085435 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 12:05:59 crc kubenswrapper[4658]: I1002 12:05:59.086535 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="prometheus" containerID="cri-o://e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" gracePeriod=600
Oct 02 12:05:59 crc kubenswrapper[4658]: I1002 12:05:59.089538 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="thanos-sidecar" containerID="cri-o://cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" gracePeriod=600
Oct 02 12:05:59 crc kubenswrapper[4658]: I1002 12:05:59.089736 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="config-reloader" containerID="cri-o://952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" gracePeriod=600
Oct 02 12:05:59 crc kubenswrapper[4658]: I1002 12:05:59.611732 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.144:9090/-/ready\": dial tcp 10.217.0.144:9090: connect: connection refused"
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.036201 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179676 4658 generic.go:334] "Generic (PLEG): container finished" podID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" exitCode=0
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179730 4658 generic.go:334] "Generic (PLEG): container finished" podID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" exitCode=0
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179746 4658 generic.go:334] "Generic (PLEG): container finished" podID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" exitCode=0
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179775 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerDied","Data":"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81"}
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179812 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerDied","Data":"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687"}
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179831 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerDied","Data":"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1"}
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179831 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179861 4658 scope.go:117] "RemoveContainer" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.179846 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63b3416a-79b7-450d-a7aa-42c1747c5c55","Type":"ContainerDied","Data":"594ac99825969811f912c326bec4d19b5d12b4d2a44d82bbcb00083665d5a981"} Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204453 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204714 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204765 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204886 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204919 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.204944 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205008 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205030 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs92g\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g\") pod 
\"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205138 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205229 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205262 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config\") pod \"63b3416a-79b7-450d-a7aa-42c1747c5c55\" (UID: \"63b3416a-79b7-450d-a7aa-42c1747c5c55\") " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.205993 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.216981 4658 scope.go:117] "RemoveContainer" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.218171 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g" (OuterVolumeSpecName: "kube-api-access-xs92g") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "kube-api-access-xs92g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.218385 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.218470 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.218909 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.233960 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config" (OuterVolumeSpecName: "config") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.234028 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.234092 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.250209 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out" (OuterVolumeSpecName: "config-out") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.260318 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "pvc-d812d300-651e-49c4-ad99-6713da3d5cbd". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.308612 4658 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309177 4658 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309192 4658 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309209 4658 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309229 4658 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309241 4658 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309257 4658 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63b3416a-79b7-450d-a7aa-42c1747c5c55-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309268 4658 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63b3416a-79b7-450d-a7aa-42c1747c5c55-config-out\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309279 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs92g\" (UniqueName: \"kubernetes.io/projected/63b3416a-79b7-450d-a7aa-42c1747c5c55-kube-api-access-xs92g\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.309343 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") on node \"crc\" " Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.330405 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config" (OuterVolumeSpecName: "web-config") pod "63b3416a-79b7-450d-a7aa-42c1747c5c55" (UID: "63b3416a-79b7-450d-a7aa-42c1747c5c55"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.355332 4658 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.355663 4658 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d812d300-651e-49c4-ad99-6713da3d5cbd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd") on node "crc" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.411253 4658 scope.go:117] "RemoveContainer" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.411925 4658 reconciler_common.go:293] "Volume detached for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.412045 4658 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63b3416a-79b7-450d-a7aa-42c1747c5c55-web-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.436010 4658 scope.go:117] "RemoveContainer" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.461974 4658 scope.go:117] "RemoveContainer" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.462738 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": container with ID starting with cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81 not found: ID does not exist" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.462858 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81"} err="failed to get container status \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": rpc error: code = NotFound desc = could not find container \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": container with ID starting with cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.462961 4658 scope.go:117] "RemoveContainer" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.463648 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": container with ID starting with 952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687 not found: ID does not exist" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.463773 4658 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687"} err="failed to get container status \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": rpc error: code = NotFound desc = could not find container \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": container with ID starting with 952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.463870 4658 scope.go:117] "RemoveContainer" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.464475 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": container with ID starting with e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1 not found: ID does not exist" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.464585 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1"} err="failed to get container status \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": rpc error: code = NotFound desc = could not find container \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": container with ID starting with e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.464674 4658 scope.go:117] "RemoveContainer" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.465912 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": container with ID starting with 82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde not found: ID does not exist" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.465978 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde"} err="failed to get container status \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": rpc error: code = NotFound desc = could not find container \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": container with ID starting with 82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.466031 4658 scope.go:117] "RemoveContainer" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.466508 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81"} err="failed to get container status \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": rpc error: code = NotFound desc = could not find container 
\"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": container with ID starting with cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.466546 4658 scope.go:117] "RemoveContainer" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.466919 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687"} err="failed to get container status \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": rpc error: code = NotFound desc = could not find container \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": container with ID starting with 952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.466945 4658 scope.go:117] "RemoveContainer" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.467427 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1"} err="failed to get container status \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": rpc error: code = NotFound desc = could not find container \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": container with ID starting with e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.467550 4658 scope.go:117] "RemoveContainer" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.467971 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde"} err="failed to get container status \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": rpc error: code = NotFound desc = could not find container \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": container with ID starting with 82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.467999 4658 scope.go:117] "RemoveContainer" containerID="cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468273 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81"} err="failed to get container status \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": rpc error: code = NotFound desc = could not find container \"cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81\": container with ID starting with cd0369931a16a39a1341a3903e4a4c8614d75d44770ee2e90d9ea4f928e1dc81 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468316 4658 scope.go:117] "RemoveContainer" containerID="952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468631 4658 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687"} err="failed to get container status \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": rpc error: code = NotFound desc = could not find container \"952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687\": container with ID starting with 952c132a14dc804f2f95a4925c3cbe825b466667ed13aebd0ad9d27c6f876687 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468653 4658 scope.go:117] "RemoveContainer" containerID="e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468917 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1"} err="failed to get container status \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": rpc error: code = NotFound desc = could not find container \"e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1\": container with ID starting with e3fc2205b36757df6820d9923521a3de8dbcb7dc8499fd3a12a617d3824aa0f1 not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.468945 4658 scope.go:117] "RemoveContainer" containerID="82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.469253 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde"} err="failed to get container status \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": rpc error: code = NotFound desc = could not find container \"82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde\": container with ID starting with 82bd94a23ba53e45a8b898247e7e73fc24cd56183978387c260a3d9d78834dde not found: ID does not exist" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.528227 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.538818 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.564926 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565533 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d923299-fe7c-4ece-8f48-7c95a141f4c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565559 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d923299-fe7c-4ece-8f48-7c95a141f4c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565587 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="prometheus" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565598 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="prometheus" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565623 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="init-config-reloader" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565633 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="init-config-reloader" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565646 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="extract-content" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565654 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="extract-content" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565671 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565680 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565700 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="extract-utilities" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565710 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="extract-utilities" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565725 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565736 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565746 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="extract-content" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565755 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="extract-content" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565767 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="extract-utilities" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565775 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="extract-utilities" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565795 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="config-reloader" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565803 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="config-reloader" Oct 02 12:06:00 crc kubenswrapper[4658]: E1002 12:06:00.565818 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="thanos-sidecar" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.565827 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="thanos-sidecar" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566054 4658 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="thanos-sidecar" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566071 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="config-reloader" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566082 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c3369a-6e86-4acc-a67d-46254c99fe19" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566106 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d923299-fe7c-4ece-8f48-7c95a141f4c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566130 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" containerName="prometheus" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.566150 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="28289dbf-4231-41cd-98d2-9f0046e7fdf8" containerName="registry-server" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.584193 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.587078 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.587250 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5lk2c" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.594003 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.594144 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.597417 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.612313 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.613117 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717755 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717829 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 
12:06:00.717868 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717906 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717929 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717961 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b8e966f-7f02-41e2-8022-99deb47a8c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.717977 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.718006 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.718027 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7b8e966f-7f02-41e2-8022-99deb47a8c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.718046 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827kz\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-kube-api-access-827kz\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.718090 4658 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819700 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819805 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819844 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b8e966f-7f02-41e2-8022-99deb47a8c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819867 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819906 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819934 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7b8e966f-7f02-41e2-8022-99deb47a8c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819957 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-827kz\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-kube-api-access-827kz\") pod \"prometheus-metric-storage-0\" (UID: 
\"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.819986 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.820034 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.820074 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.823417 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7b8e966f-7f02-41e2-8022-99deb47a8c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.823713 4658 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.823776 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b2727623f8bbe474018a880a77329ded2fae90762c86c59a9726b562d3cbf13f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.824552 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b8e966f-7f02-41e2-8022-99deb47a8c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.825661 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.826316 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.826600 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.826795 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.827150 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.828638 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.834442 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b8e966f-7f02-41e2-8022-99deb47a8c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.855339 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-827kz\" (UniqueName: \"kubernetes.io/projected/7b8e966f-7f02-41e2-8022-99deb47a8c93-kube-api-access-827kz\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.863548 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d812d300-651e-49c4-ad99-6713da3d5cbd\") pod \"prometheus-metric-storage-0\" (UID: \"7b8e966f-7f02-41e2-8022-99deb47a8c93\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:00 crc kubenswrapper[4658]: I1002 12:06:00.910583 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:01 crc kubenswrapper[4658]: I1002 12:06:01.358532 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 12:06:01 crc kubenswrapper[4658]: I1002 12:06:01.967426 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b3416a-79b7-450d-a7aa-42c1747c5c55" path="/var/lib/kubelet/pods/63b3416a-79b7-450d-a7aa-42c1747c5c55/volumes"
Oct 02 12:06:02 crc kubenswrapper[4658]: I1002 12:06:02.201187 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerStarted","Data":"9d95789aae2a34385e3fa2c1945e4c5c78beae61c79c6e746e4f5fccdb292147"}
Oct 02 12:06:06 crc kubenswrapper[4658]: I1002 12:06:06.249516 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerStarted","Data":"69a5c00bfa9a80bba7914e42871dea048f4269c16fab7647f83406ce453540e9"}
Oct 02 12:06:14 crc kubenswrapper[4658]: I1002 12:06:14.350176 4658 generic.go:334] "Generic (PLEG): container finished" podID="7b8e966f-7f02-41e2-8022-99deb47a8c93" containerID="69a5c00bfa9a80bba7914e42871dea048f4269c16fab7647f83406ce453540e9" exitCode=0
Oct 02 12:06:14 crc kubenswrapper[4658]: I1002 12:06:14.350682 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerDied","Data":"69a5c00bfa9a80bba7914e42871dea048f4269c16fab7647f83406ce453540e9"}
Oct 02 12:06:15 crc kubenswrapper[4658]: I1002 12:06:15.366905 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerStarted","Data":"6af2da4ba493cca823bda39064c5c91c1335f1b94660a617755b5c19c6c56deb"}
Oct 02 12:06:18 crc kubenswrapper[4658]: I1002 12:06:18.406241 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerStarted","Data":"1c6c58152759634beb7693069aa59567c49b9b72d3a4ac7f3044b246574f8308"}
Oct 02 12:06:18 crc kubenswrapper[4658]: I1002 12:06:18.406757 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b8e966f-7f02-41e2-8022-99deb47a8c93","Type":"ContainerStarted","Data":"6e045ba6f7b1b6768a03b4c59e3fb6be1fc87ede4f6f6923138b67ea58423f4a"}
Oct 02 12:06:18 crc kubenswrapper[4658]: I1002 12:06:18.435936 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.43591945 podStartE2EDuration="18.43591945s" podCreationTimestamp="2025-10-02 12:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:06:18.427501929 +0000 UTC m=+2859.318655506" watchObservedRunningTime="2025-10-02 12:06:18.43591945 +0000 UTC m=+2859.327073017"
Oct 02 12:06:20 crc kubenswrapper[4658]: I1002 12:06:20.911519 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.417198 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.421585 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.427428 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.538871 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l286d\" (UniqueName: \"kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.538947 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.539009 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.640777 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l286d\" (UniqueName: \"kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.640860 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.640939 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.641368 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.641565 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.665859 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l286d\" (UniqueName: \"kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d\") pod \"redhat-marketplace-s7gvg\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") " pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.746528 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.913318 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:30 crc kubenswrapper[4658]: I1002 12:06:30.927376 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:31 crc kubenswrapper[4658]: I1002 12:06:31.220127 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:31 crc kubenswrapper[4658]: I1002 12:06:31.557425 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerStarted","Data":"9fbf60ccd5e0bdddcd04b77fd1bab1d1b2ee1ae4da921cf58f612cfca2aa5762"}
Oct 02 12:06:31 crc kubenswrapper[4658]: I1002 12:06:31.561785 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.211532 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.215362 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.219769 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.378651 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.378753 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.378854 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwctl\" (UniqueName: \"kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.480589 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.480652 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.480733 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwctl\" (UniqueName: \"kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.481245 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.481331 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.502420 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwctl\" (UniqueName: \"kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl\") pod \"redhat-operators-t6s7m\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") " pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.546255 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.573097 4658 generic.go:334] "Generic (PLEG): container finished" podID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerID="959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa" exitCode=0
Oct 02 12:06:32 crc kubenswrapper[4658]: I1002 12:06:32.573228 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerDied","Data":"959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa"}
Oct 02 12:06:33 crc kubenswrapper[4658]: I1002 12:06:33.037346 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:33 crc kubenswrapper[4658]: I1002 12:06:33.583280 4658 generic.go:334] "Generic (PLEG): container finished" podID="89ab72b1-2697-4ba1-9903-28178145aab4" containerID="a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41" exitCode=0
Oct 02 12:06:33 crc kubenswrapper[4658]: I1002 12:06:33.583406 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerDied","Data":"a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41"}
Oct 02 12:06:33 crc kubenswrapper[4658]: I1002 12:06:33.583834 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerStarted","Data":"d7fe04d0e7dd966b597ea755a089ac4c6cf51c3e2c065b75de3659694c37806e"}
Oct 02 12:06:34 crc kubenswrapper[4658]: I1002 12:06:34.600329 4658 generic.go:334] "Generic (PLEG): container finished" podID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerID="ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45" exitCode=0
Oct 02 12:06:34 crc kubenswrapper[4658]: I1002 12:06:34.600409 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerDied","Data":"ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45"}
Oct 02 12:06:35 crc kubenswrapper[4658]: I1002 12:06:35.614007 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerStarted","Data":"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"}
Oct 02 12:06:35 crc kubenswrapper[4658]: I1002 12:06:35.620904 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerDied","Data":"dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6"}
Oct 02 12:06:35 crc kubenswrapper[4658]: I1002 12:06:35.622356 4658 generic.go:334] "Generic (PLEG): container finished" podID="89ab72b1-2697-4ba1-9903-28178145aab4" containerID="dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6" exitCode=0
Oct 02 12:06:35 crc kubenswrapper[4658]: I1002 12:06:35.642098 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s7gvg" podStartSLOduration=3.131258001 podStartE2EDuration="5.64207659s" podCreationTimestamp="2025-10-02 12:06:30 +0000 UTC" firstStartedPulling="2025-10-02 12:06:32.576370701 +0000 UTC m=+2873.467524268" lastFinishedPulling="2025-10-02 12:06:35.08718929 +0000 UTC m=+2875.978342857" observedRunningTime="2025-10-02 12:06:35.638426404 +0000 UTC m=+2876.529580001" watchObservedRunningTime="2025-10-02 12:06:35.64207659 +0000 UTC m=+2876.533230157"
Oct 02 12:06:37 crc kubenswrapper[4658]: I1002 12:06:37.665878 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerStarted","Data":"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"}
Oct 02 12:06:37 crc kubenswrapper[4658]: I1002 12:06:37.693079 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t6s7m" podStartSLOduration=2.860306022 podStartE2EDuration="5.693059128s" podCreationTimestamp="2025-10-02 12:06:32 +0000 UTC" firstStartedPulling="2025-10-02 12:06:33.607333226 +0000 UTC m=+2874.498486793" lastFinishedPulling="2025-10-02 12:06:36.440086292 +0000 UTC m=+2877.331239899" observedRunningTime="2025-10-02 12:06:37.682964933 +0000 UTC m=+2878.574118510" watchObservedRunningTime="2025-10-02 12:06:37.693059128 +0000 UTC m=+2878.584212705"
Oct 02 12:06:40 crc kubenswrapper[4658]: I1002 12:06:40.746776 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:40 crc kubenswrapper[4658]: I1002 12:06:40.747125 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:40 crc kubenswrapper[4658]: I1002 12:06:40.796835 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:41 crc kubenswrapper[4658]: I1002 12:06:41.775505 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:41 crc kubenswrapper[4658]: I1002 12:06:41.822287 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:42 crc kubenswrapper[4658]: I1002 12:06:42.548044 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:42 crc kubenswrapper[4658]: I1002 12:06:42.549422 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:42 crc kubenswrapper[4658]: I1002 12:06:42.602404 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:42 crc kubenswrapper[4658]: I1002 12:06:42.755830 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:43 crc kubenswrapper[4658]: I1002 12:06:43.721054 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s7gvg" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="registry-server" containerID="cri-o://8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9" gracePeriod=2
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.007848 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.273445 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.437800 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities\") pod \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") "
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.437967 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l286d\" (UniqueName: \"kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d\") pod \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") "
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.438037 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content\") pod \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\" (UID: \"74d1769e-b4e7-4fe1-9e23-a0181f9de404\") "
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.438625 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities" (OuterVolumeSpecName: "utilities") pod "74d1769e-b4e7-4fe1-9e23-a0181f9de404" (UID: "74d1769e-b4e7-4fe1-9e23-a0181f9de404"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.444086 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d" (OuterVolumeSpecName: "kube-api-access-l286d") pod "74d1769e-b4e7-4fe1-9e23-a0181f9de404" (UID: "74d1769e-b4e7-4fe1-9e23-a0181f9de404"). InnerVolumeSpecName "kube-api-access-l286d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.453684 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74d1769e-b4e7-4fe1-9e23-a0181f9de404" (UID: "74d1769e-b4e7-4fe1-9e23-a0181f9de404"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.539940 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.540253 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l286d\" (UniqueName: \"kubernetes.io/projected/74d1769e-b4e7-4fe1-9e23-a0181f9de404-kube-api-access-l286d\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.540263 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d1769e-b4e7-4fe1-9e23-a0181f9de404-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.733786 4658 generic.go:334] "Generic (PLEG): container finished" podID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerID="8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9" exitCode=0
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.733847 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerDied","Data":"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"}
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.733894 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7gvg"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.733923 4658 scope.go:117] "RemoveContainer" containerID="8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.733905 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7gvg" event={"ID":"74d1769e-b4e7-4fe1-9e23-a0181f9de404","Type":"ContainerDied","Data":"9fbf60ccd5e0bdddcd04b77fd1bab1d1b2ee1ae4da921cf58f612cfca2aa5762"}
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.768637 4658 scope.go:117] "RemoveContainer" containerID="ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.798978 4658 scope.go:117] "RemoveContainer" containerID="959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.799008 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.820734 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7gvg"]
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.851320 4658 scope.go:117] "RemoveContainer" containerID="8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"
Oct 02 12:06:44 crc kubenswrapper[4658]: E1002 12:06:44.852573 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9\": container with ID starting with 8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9 not found: ID does not exist" containerID="8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.852724 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9"} err="failed to get container status \"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9\": rpc error: code = NotFound desc = could not find container \"8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9\": container with ID starting with 8cb6615c3a5d3eb9f07a3cb696d50ddf8fa41ec9080b7f27a976e5ac36bb5cc9 not found: ID does not exist"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.852838 4658 scope.go:117] "RemoveContainer" containerID="ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45"
Oct 02 12:06:44 crc kubenswrapper[4658]: E1002 12:06:44.853278 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45\": container with ID starting with ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45 not found: ID does not exist" containerID="ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.853338 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45"} err="failed to get container status \"ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45\": rpc error: code = NotFound desc = could not find container \"ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45\": container with ID starting with ec43b2059cd834c9f55603c5bb89e282e66f3c8fc51de7d8afbba6107d06ad45 not found: ID does not exist"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.853367 4658 scope.go:117] "RemoveContainer" containerID="959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa"
Oct 02 12:06:44 crc kubenswrapper[4658]: E1002 12:06:44.853652 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa\": container with ID starting with 959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa not found: ID does not exist" containerID="959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa"
Oct 02 12:06:44 crc kubenswrapper[4658]: I1002 12:06:44.853691 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa"} err="failed to get container status \"959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa\": rpc error: code = NotFound desc = could not find container \"959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa\": container with ID starting with 959fab65da296c19eae4b41824e442505c34538ab269865346e979507d0ec2fa not found: ID does not exist"
Oct 02 12:06:45 crc kubenswrapper[4658]: I1002 12:06:45.744761 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t6s7m" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="registry-server" containerID="cri-o://a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2" gracePeriod=2
Oct 02 12:06:45 crc kubenswrapper[4658]: I1002 12:06:45.976059 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" path="/var/lib/kubelet/pods/74d1769e-b4e7-4fe1-9e23-a0181f9de404/volumes"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.286586 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.482399 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities\") pod \"89ab72b1-2697-4ba1-9903-28178145aab4\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") "
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.482527 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content\") pod \"89ab72b1-2697-4ba1-9903-28178145aab4\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") "
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.482682 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwctl\" (UniqueName: \"kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl\") pod \"89ab72b1-2697-4ba1-9903-28178145aab4\" (UID: \"89ab72b1-2697-4ba1-9903-28178145aab4\") "
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.483486 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities" (OuterVolumeSpecName: "utilities") pod "89ab72b1-2697-4ba1-9903-28178145aab4" (UID: "89ab72b1-2697-4ba1-9903-28178145aab4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.484148 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.493957 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl" (OuterVolumeSpecName: "kube-api-access-mwctl") pod "89ab72b1-2697-4ba1-9903-28178145aab4" (UID: "89ab72b1-2697-4ba1-9903-28178145aab4"). InnerVolumeSpecName "kube-api-access-mwctl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.572174 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89ab72b1-2697-4ba1-9903-28178145aab4" (UID: "89ab72b1-2697-4ba1-9903-28178145aab4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.585953 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ab72b1-2697-4ba1-9903-28178145aab4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.585990 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwctl\" (UniqueName: \"kubernetes.io/projected/89ab72b1-2697-4ba1-9903-28178145aab4-kube-api-access-mwctl\") on node \"crc\" DevicePath \"\""
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.755751 4658 generic.go:334] "Generic (PLEG): container finished" podID="89ab72b1-2697-4ba1-9903-28178145aab4" containerID="a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2" exitCode=0
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.755792 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerDied","Data":"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"}
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.755820 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6s7m" event={"ID":"89ab72b1-2697-4ba1-9903-28178145aab4","Type":"ContainerDied","Data":"d7fe04d0e7dd966b597ea755a089ac4c6cf51c3e2c065b75de3659694c37806e"}
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.755841 4658 scope.go:117] "RemoveContainer" containerID="a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.755946 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6s7m"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.782781 4658 scope.go:117] "RemoveContainer" containerID="dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.790210 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.807013 4658 scope.go:117] "RemoveContainer" containerID="a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.808688 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t6s7m"]
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.854478 4658 scope.go:117] "RemoveContainer" containerID="a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"
Oct 02 12:06:46 crc kubenswrapper[4658]: E1002 12:06:46.854945 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2\": container with ID starting with a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2 not found: ID does not exist" containerID="a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.854996 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2"} err="failed to get container status \"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2\": rpc error: code = NotFound desc = could not find container \"a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2\": container with ID starting with a6c2554ab78dacc771b2c71efc6aba7cf0e942bccdcb258d757329ed0cd1d7e2 not found: ID does not exist"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.855025 4658 scope.go:117] "RemoveContainer" containerID="dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6"
Oct 02 12:06:46 crc kubenswrapper[4658]: E1002 12:06:46.855333 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6\": container with ID starting with dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6 not found: ID does not exist" containerID="dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.855373 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6"} err="failed to get container status \"dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6\": rpc error: code = NotFound desc = could not find container \"dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6\": container with ID starting with dcdff0dc19560d084ee25c13cc90229ea7920b79b15f9026d871a6f477c995f6 not found: ID does not exist"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.855572 4658 scope.go:117] "RemoveContainer" containerID="a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41"
Oct 02 12:06:46 crc kubenswrapper[4658]: E1002 12:06:46.856000 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41\": container with ID starting with a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41 not found: ID does not exist" containerID="a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41"
Oct 02 12:06:46 crc kubenswrapper[4658]: I1002 12:06:46.856090 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41"} err="failed to get container status \"a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41\": rpc error: code = NotFound desc = could not find container \"a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41\": container with ID starting with a15b54702fc13a7e1cb92dff78a2a5c6d1cb6e3294305cf7b2c6fd55cb84bf41 not found: ID does not exist"
Oct 02 12:06:47 crc kubenswrapper[4658]: I1002 12:06:47.970035 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" path="/var/lib/kubelet/pods/89ab72b1-2697-4ba1-9903-28178145aab4/volumes"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.430195 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.431014 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.494841 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495360 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="extract-utilities"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495380 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="extract-utilities"
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495395 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495402 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495418 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="extract-utilities"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495429 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="extract-utilities"
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495458 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="extract-content"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495464 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="extract-content"
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495488 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495494 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: E1002 12:06:57.495511 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="extract-content"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495516 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="extract-content"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495705 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ab72b1-2697-4ba1-9903-28178145aab4" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.495729 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d1769e-b4e7-4fe1-9e23-a0181f9de404" containerName="registry-server"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.496460 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.499236 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.499238 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kt87g"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.499419 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.499441 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.507712 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603063 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603330 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcb8r\" (UniqueName: \"kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603518 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603571 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603721 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603776 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603818 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.603971 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.604019 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.705774 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcb8r\" (UniqueName: \"kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.705868 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.705907 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.705974 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.706020 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.706046 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.706137 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.706178 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.706243 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.707052 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.707526 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.707528 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.707588 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.708388 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.713235 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.713434 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.713741 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.728665 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcb8r\" (UniqueName: \"kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.736399 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " pod="openstack/tempest-tests-tempest"
Oct 02 12:06:57 crc kubenswrapper[4658]: I1002 12:06:57.830026 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 02 12:06:58 crc kubenswrapper[4658]: I1002 12:06:58.291528 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 02 12:06:58 crc kubenswrapper[4658]: I1002 12:06:58.889357 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fd9ceedd-f5a7-425a-9112-998edc1d3e00","Type":"ContainerStarted","Data":"6f8ecffc94a678d95013b478002df36759405cc22d0725995edecf9f36328ee7"}
Oct 02 12:07:09 crc kubenswrapper[4658]: I1002 12:07:09.012534 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fd9ceedd-f5a7-425a-9112-998edc1d3e00","Type":"ContainerStarted","Data":"2eeff51e8f7d15bc6e32c95d28eda08c055da1b1e799602917a584e943d080f9"}
Oct 02 12:07:09 crc kubenswrapper[4658]: I1002 12:07:09.036463 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.570800428 podStartE2EDuration="13.036438786s" podCreationTimestamp="2025-10-02 12:06:56 +0000 UTC" firstStartedPulling="2025-10-02 12:06:58.27674955 +0000 UTC m=+2899.167903117" lastFinishedPulling="2025-10-02 12:07:07.742387888 +0000 UTC m=+2908.633541475" observedRunningTime="2025-10-02 12:07:09.034775843 +0000 UTC m=+2909.925929460" watchObservedRunningTime="2025-10-02 12:07:09.036438786 +0000 UTC m=+2909.927592383"
Oct 02 12:07:27 crc kubenswrapper[4658]: I1002 12:07:27.430230 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:07:27 crc kubenswrapper[4658]: I1002 12:07:27.432661 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:07:57 crc kubenswrapper[4658]: I1002 12:07:57.430262 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:07:57 crc kubenswrapper[4658]: I1002 12:07:57.430796 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:07:57 crc kubenswrapper[4658]: I1002 12:07:57.430849 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5"
Oct 02 12:07:57 crc kubenswrapper[4658]: I1002 12:07:57.431708 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:07:57 crc kubenswrapper[4658]: I1002 12:07:57.431781 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" gracePeriod=600
Oct 02 12:07:57 crc kubenswrapper[4658]: E1002 12:07:57.570062 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:07:58 crc kubenswrapper[4658]: I1002 12:07:58.549107 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" exitCode=0
Oct 02 12:07:58 crc kubenswrapper[4658]: I1002 12:07:58.549151 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"}
Oct 02 12:07:58 crc kubenswrapper[4658]: I1002 12:07:58.549206 4658 scope.go:117] "RemoveContainer" containerID="d989ab5f5af3825de25ac06ecb779c66d5be9cdd7d7940e539d8e4851ab55f5f"
Oct 02 12:07:58 crc kubenswrapper[4658]: I1002 12:07:58.549888 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:07:58 crc kubenswrapper[4658]: E1002 12:07:58.550314 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:08:09 crc kubenswrapper[4658]: I1002 12:08:09.965323 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:08:09 crc kubenswrapper[4658]: E1002 12:08:09.966167 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:08:21 crc kubenswrapper[4658]: I1002 12:08:21.949946 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:08:21 crc kubenswrapper[4658]: E1002 12:08:21.950768 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:08:35 crc kubenswrapper[4658]: I1002 12:08:35.950440 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:08:35 crc kubenswrapper[4658]: E1002 12:08:35.951666 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:08:47 crc kubenswrapper[4658]: I1002 12:08:47.950184 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:08:47 crc kubenswrapper[4658]: E1002 12:08:47.951311 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:09:00 crc kubenswrapper[4658]: I1002 12:09:00.948759 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:09:00 crc kubenswrapper[4658]: E1002 12:09:00.949645 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:09:15 crc kubenswrapper[4658]: I1002 12:09:15.949692 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:09:15 crc kubenswrapper[4658]: E1002 12:09:15.950587 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:09:28 crc kubenswrapper[4658]: I1002 12:09:28.949083 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:09:28 crc kubenswrapper[4658]: E1002 12:09:28.949925 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:09:42 crc kubenswrapper[4658]: I1002 12:09:42.948823 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:09:42 crc kubenswrapper[4658]: E1002 12:09:42.949636 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:09:53 crc kubenswrapper[4658]: I1002 12:09:53.949250 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:09:53 crc kubenswrapper[4658]: E1002 12:09:53.950035 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:10:06 crc kubenswrapper[4658]: I1002 12:10:06.949614 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:10:06 crc kubenswrapper[4658]: E1002 12:10:06.950773 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:10:21 crc kubenswrapper[4658]: I1002 12:10:21.951169 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:10:21 crc kubenswrapper[4658]: E1002 12:10:21.951930 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:10:33 crc kubenswrapper[4658]: I1002 12:10:33.950581 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:10:33 crc kubenswrapper[4658]: E1002 12:10:33.951220 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:10:48 crc kubenswrapper[4658]: I1002 12:10:48.949553 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:10:48 crc kubenswrapper[4658]: E1002 12:10:48.950465 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:11:00 crc kubenswrapper[4658]: I1002 12:11:00.949906 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:11:00 crc kubenswrapper[4658]: E1002 12:11:00.951463 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:11:13 crc kubenswrapper[4658]: I1002 12:11:13.949019 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:11:13 crc kubenswrapper[4658]: E1002 12:11:13.949825 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:11:24 crc kubenswrapper[4658]: I1002 12:11:24.949453 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:11:24 crc kubenswrapper[4658]: E1002 12:11:24.950434 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:11:36 crc kubenswrapper[4658]: I1002 12:11:36.949809 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:11:36 crc kubenswrapper[4658]: E1002 12:11:36.950732 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:11:51 crc kubenswrapper[4658]: I1002 12:11:51.949607 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af"
Oct 02 12:11:51 crc kubenswrapper[4658]: E1002 12:11:51.950639 4658 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:12:02 crc kubenswrapper[4658]: I1002 12:12:02.973244 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:12:02 crc kubenswrapper[4658]: E1002 12:12:02.974048 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:12:13 crc kubenswrapper[4658]: I1002 12:12:13.949325 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:12:13 crc kubenswrapper[4658]: E1002 12:12:13.950183 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:12:25 crc kubenswrapper[4658]: I1002 12:12:25.949729 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:12:25 crc kubenswrapper[4658]: E1002 12:12:25.950795 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:12:36 crc kubenswrapper[4658]: I1002 12:12:36.949708 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:12:36 crc kubenswrapper[4658]: E1002 12:12:36.950759 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:12:51 crc kubenswrapper[4658]: I1002 12:12:51.949756 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:12:51 crc kubenswrapper[4658]: E1002 12:12:51.951209 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:13:02 crc kubenswrapper[4658]: I1002 12:13:02.949191 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:13:03 crc kubenswrapper[4658]: I1002 12:13:03.667834 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b"} Oct 02 12:13:30 crc kubenswrapper[4658]: E1002 12:13:30.152238 4658 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:38620->38.102.83.32:41677: write tcp 38.102.83.32:38620->38.102.83.32:41677: write: broken pipe Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.185001 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg"] Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.187045 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.193432 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.195256 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.200516 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg"] Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.295383 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll42g\" (UniqueName: \"kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.295752 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.295907 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.397812 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll42g\" (UniqueName: 
\"kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.397892 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.397944 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.399117 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.407011 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.420465 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll42g\" (UniqueName: \"kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g\") pod \"collect-profiles-29323455-k26jg\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.514702 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:00 crc kubenswrapper[4658]: I1002 12:15:00.957676 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg"] Oct 02 12:15:01 crc kubenswrapper[4658]: I1002 12:15:01.756800 4658 generic.go:334] "Generic (PLEG): container finished" podID="221bb351-32c3-4da4-8cb1-92f3ec37e89d" containerID="ef08a02f063a1e7f4dc3284d3fa4c987a6164000a93a30cc5f6ec86006e38b01" exitCode=0 Oct 02 12:15:01 crc kubenswrapper[4658]: I1002 12:15:01.756857 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" event={"ID":"221bb351-32c3-4da4-8cb1-92f3ec37e89d","Type":"ContainerDied","Data":"ef08a02f063a1e7f4dc3284d3fa4c987a6164000a93a30cc5f6ec86006e38b01"} Oct 02 12:15:01 crc kubenswrapper[4658]: I1002 12:15:01.757084 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" event={"ID":"221bb351-32c3-4da4-8cb1-92f3ec37e89d","Type":"ContainerStarted","Data":"34c36cb4a3bc7e51f45115038550365646f5c60b1e6ca7324dcf352a9e2532b4"} Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.119884 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.260754 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume\") pod \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.260834 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll42g\" (UniqueName: \"kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g\") pod \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.260900 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume\") pod \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\" (UID: \"221bb351-32c3-4da4-8cb1-92f3ec37e89d\") " Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.262008 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume" (OuterVolumeSpecName: "config-volume") pod "221bb351-32c3-4da4-8cb1-92f3ec37e89d" (UID: "221bb351-32c3-4da4-8cb1-92f3ec37e89d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.266846 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "221bb351-32c3-4da4-8cb1-92f3ec37e89d" (UID: "221bb351-32c3-4da4-8cb1-92f3ec37e89d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.267430 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g" (OuterVolumeSpecName: "kube-api-access-ll42g") pod "221bb351-32c3-4da4-8cb1-92f3ec37e89d" (UID: "221bb351-32c3-4da4-8cb1-92f3ec37e89d"). InnerVolumeSpecName "kube-api-access-ll42g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.363260 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/221bb351-32c3-4da4-8cb1-92f3ec37e89d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.363321 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/221bb351-32c3-4da4-8cb1-92f3ec37e89d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.363331 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll42g\" (UniqueName: \"kubernetes.io/projected/221bb351-32c3-4da4-8cb1-92f3ec37e89d-kube-api-access-ll42g\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.778492 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" event={"ID":"221bb351-32c3-4da4-8cb1-92f3ec37e89d","Type":"ContainerDied","Data":"34c36cb4a3bc7e51f45115038550365646f5c60b1e6ca7324dcf352a9e2532b4"} Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.778531 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c36cb4a3bc7e51f45115038550365646f5c60b1e6ca7324dcf352a9e2532b4" Oct 02 12:15:03 crc kubenswrapper[4658]: I1002 12:15:03.778586 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg" Oct 02 12:15:04 crc kubenswrapper[4658]: I1002 12:15:04.202163 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr"] Oct 02 12:15:04 crc kubenswrapper[4658]: I1002 12:15:04.210756 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-4tkcr"] Oct 02 12:15:05 crc kubenswrapper[4658]: I1002 12:15:05.967075 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa8c328-ed90-4500-866b-f66b33ad5528" path="/var/lib/kubelet/pods/ffa8c328-ed90-4500-866b-f66b33ad5528/volumes" Oct 02 12:15:24 crc kubenswrapper[4658]: I1002 12:15:24.330968 4658 scope.go:117] "RemoveContainer" containerID="8441418dca09dd20a05408ca9d560ef275ee4372d85ff0c3e149bab9fb3e199c" Oct 02 12:15:27 crc kubenswrapper[4658]: I1002 12:15:27.430389 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:15:27 crc kubenswrapper[4658]: I1002 12:15:27.430973 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.763594 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:34 crc kubenswrapper[4658]: E1002 12:15:34.764715 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221bb351-32c3-4da4-8cb1-92f3ec37e89d" containerName="collect-profiles" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.764734 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bb351-32c3-4da4-8cb1-92f3ec37e89d" containerName="collect-profiles" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.764972 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="221bb351-32c3-4da4-8cb1-92f3ec37e89d" containerName="collect-profiles" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.766726 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.773819 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.806327 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.806370 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prjg\" (UniqueName: \"kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.806424 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.908644 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.908813 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.908854 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prjg\" (UniqueName: \"kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.909099 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.909333 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:34 crc kubenswrapper[4658]: I1002 12:15:34.930218 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9prjg\" (UniqueName: \"kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg\") pod \"certified-operators-zn8nc\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:35 crc kubenswrapper[4658]: I1002 12:15:35.092960 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:35 crc kubenswrapper[4658]: I1002 12:15:35.650479 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:36 crc kubenswrapper[4658]: I1002 12:15:36.098822 4658 generic.go:334] "Generic (PLEG): container finished" podID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerID="bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6" exitCode=0 Oct 02 12:15:36 crc kubenswrapper[4658]: I1002 12:15:36.098896 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerDied","Data":"bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6"} Oct 02 12:15:36 crc kubenswrapper[4658]: I1002 12:15:36.098929 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerStarted","Data":"111dfe3889048bab8aa9417400ced6aa47827af0cb454fb6bad6610513635d25"} Oct 02 12:15:36 crc kubenswrapper[4658]: I1002 12:15:36.102244 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:15:38 crc kubenswrapper[4658]: I1002 12:15:38.121477 4658 generic.go:334] "Generic (PLEG): container finished" podID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerID="cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089" exitCode=0 Oct 02 12:15:38 crc kubenswrapper[4658]: I1002 12:15:38.121530 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerDied","Data":"cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089"} Oct 02 12:15:39 crc kubenswrapper[4658]: I1002 12:15:39.134081 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerStarted","Data":"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f"} Oct 02 12:15:39 crc kubenswrapper[4658]: I1002 12:15:39.164818 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zn8nc" podStartSLOduration=2.6000953879999997 podStartE2EDuration="5.164795306s" podCreationTimestamp="2025-10-02 12:15:34 +0000 UTC" firstStartedPulling="2025-10-02 12:15:36.101581099 +0000 UTC m=+3416.992734666" lastFinishedPulling="2025-10-02 12:15:38.666281017 +0000 UTC m=+3419.557434584" observedRunningTime="2025-10-02 12:15:39.1562166 +0000 UTC m=+3420.047370237" watchObservedRunningTime="2025-10-02 12:15:39.164795306 +0000 UTC m=+3420.055948883" Oct 02 12:15:45 crc kubenswrapper[4658]: I1002 12:15:45.093651 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:45 crc kubenswrapper[4658]: I1002 12:15:45.094124 4658 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:45 crc kubenswrapper[4658]: I1002 12:15:45.143909 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:45 crc kubenswrapper[4658]: I1002 12:15:45.244856 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:45 crc kubenswrapper[4658]: I1002 12:15:45.378936 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.214826 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zn8nc" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="registry-server" containerID="cri-o://094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f" gracePeriod=2 Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.806242 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.921985 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities\") pod \"6457197b-c482-42cc-b5af-ec5ac57e1042\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.922169 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9prjg\" (UniqueName: \"kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg\") pod \"6457197b-c482-42cc-b5af-ec5ac57e1042\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.922263 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content\") pod \"6457197b-c482-42cc-b5af-ec5ac57e1042\" (UID: \"6457197b-c482-42cc-b5af-ec5ac57e1042\") " Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.923040 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities" (OuterVolumeSpecName: "utilities") pod "6457197b-c482-42cc-b5af-ec5ac57e1042" (UID: "6457197b-c482-42cc-b5af-ec5ac57e1042"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:15:47 crc kubenswrapper[4658]: I1002 12:15:47.933588 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg" (OuterVolumeSpecName: "kube-api-access-9prjg") pod "6457197b-c482-42cc-b5af-ec5ac57e1042" (UID: "6457197b-c482-42cc-b5af-ec5ac57e1042"). InnerVolumeSpecName "kube-api-access-9prjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:48 crc kubenswrapper[4658]: I1002 12:15:48.024635 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9prjg\" (UniqueName: \"kubernetes.io/projected/6457197b-c482-42cc-b5af-ec5ac57e1042-kube-api-access-9prjg\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:48 crc kubenswrapper[4658]: I1002 12:15:48.024677 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.226246 4658 generic.go:334] "Generic (PLEG): container finished" podID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerID="094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f" exitCode=0 Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.226288 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerDied","Data":"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f"} Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.226347 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zn8nc" event={"ID":"6457197b-c482-42cc-b5af-ec5ac57e1042","Type":"ContainerDied","Data":"111dfe3889048bab8aa9417400ced6aa47827af0cb454fb6bad6610513635d25"} Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.226374 4658 scope.go:117] "RemoveContainer" containerID="094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.226583 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zn8nc" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.257040 4658 scope.go:117] "RemoveContainer" containerID="cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.282674 4658 scope.go:117] "RemoveContainer" containerID="bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.345123 4658 scope.go:117] "RemoveContainer" containerID="094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f" Oct 02 12:15:49 crc kubenswrapper[4658]: E1002 12:15:48.345607 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f\": container with ID starting with 094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f not found: ID does not exist" containerID="094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.345639 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f"} err="failed to get container status \"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f\": rpc error: code = NotFound desc = could not find container \"094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f\": container with ID starting with 094561bcf1d0d40da8d0640172e466f469e21d47736ed527117512301fadd00f not found: ID does not exist" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.345665 4658 scope.go:117] "RemoveContainer" containerID="cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089" Oct 02 12:15:49 crc kubenswrapper[4658]: E1002 12:15:48.346277 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089\": container with ID starting with cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089 not found: ID does not exist" containerID="cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.346579 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089"} err="failed to get container status \"cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089\": rpc error: code = NotFound desc = could not find container \"cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089\": container with ID starting with cfd4c41cd627e5fb0b8ff6cae173d2ecab7bc8c69c6ed7ba5ce6d9cd343c2089 not found: ID does not exist" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.346596 4658 scope.go:117] "RemoveContainer" containerID="bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6" Oct 02 12:15:49 crc kubenswrapper[4658]: E1002 12:15:48.346878 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6\": container with ID starting with bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6 not found: ID does not exist" containerID="bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6" 
Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.346899 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6"} err="failed to get container status \"bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6\": rpc error: code = NotFound desc = could not find container \"bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6\": container with ID starting with bef5586be5c7a4df2926d6cc27d856252add275c2d28416d44e7575f150521f6 not found: ID does not exist" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.778263 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6457197b-c482-42cc-b5af-ec5ac57e1042" (UID: "6457197b-c482-42cc-b5af-ec5ac57e1042"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.841048 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6457197b-c482-42cc-b5af-ec5ac57e1042-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.861006 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:48.871319 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zn8nc"] Oct 02 12:15:49 crc kubenswrapper[4658]: I1002 12:15:49.965801 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" path="/var/lib/kubelet/pods/6457197b-c482-42cc-b5af-ec5ac57e1042/volumes" Oct 02 12:15:57 crc kubenswrapper[4658]: I1002 12:15:57.430081 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:15:57 crc kubenswrapper[4658]: I1002 12:15:57.430606 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.429808 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.431486 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.431614 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.432497 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.432651 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b" gracePeriod=600 Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.608603 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b" exitCode=0 Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.608642 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b"} Oct 02 12:16:27 crc kubenswrapper[4658]: I1002 12:16:27.608676 4658 scope.go:117] "RemoveContainer" containerID="804df71e90164f42d2f5f03ab224fb99a6506ae32d99c010f79c16eff7e9e1af" Oct 02 12:16:28 crc kubenswrapper[4658]: I1002 12:16:28.622096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d"} Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.542130 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:16:51 crc kubenswrapper[4658]: E1002 12:16:51.543401 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="extract-content" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.543424 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="extract-content" Oct 02 12:16:51 crc kubenswrapper[4658]: E1002 12:16:51.543443 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="extract-utilities" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.543456 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="extract-utilities" Oct 02 12:16:51 crc kubenswrapper[4658]: E1002 12:16:51.543483 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="registry-server" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.543497 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" containerName="registry-server" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.543820 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6457197b-c482-42cc-b5af-ec5ac57e1042" 
containerName="registry-server" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.545685 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.576812 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.693833 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48db\" (UniqueName: \"kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.693913 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.693930 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.795820 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.795884 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.796046 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48db\" (UniqueName: \"kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.796458 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.796599 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " 
pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.829170 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48db\" (UniqueName: \"kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db\") pod \"redhat-marketplace-5z94s\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:51 crc kubenswrapper[4658]: I1002 12:16:51.871846 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:16:52 crc kubenswrapper[4658]: I1002 12:16:52.385606 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:16:52 crc kubenswrapper[4658]: I1002 12:16:52.869752 4658 generic.go:334] "Generic (PLEG): container finished" podID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerID="063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79" exitCode=0 Oct 02 12:16:52 crc kubenswrapper[4658]: I1002 12:16:52.870058 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerDied","Data":"063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79"} Oct 02 12:16:52 crc kubenswrapper[4658]: I1002 12:16:52.870197 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerStarted","Data":"54b5119641a55b7aac0bab10f6529a49d6e318d2ff6a7d40b62666ef0d881b84"} Oct 02 12:16:53 crc kubenswrapper[4658]: I1002 12:16:53.881397 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerStarted","Data":"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7"} Oct 02 12:16:54 crc kubenswrapper[4658]: I1002 12:16:54.893275 4658 generic.go:334] "Generic (PLEG): container finished" podID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerID="32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7" exitCode=0 Oct 02 12:16:54 crc kubenswrapper[4658]: I1002 12:16:54.893387 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerDied","Data":"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7"} Oct 02 12:16:55 crc kubenswrapper[4658]: I1002 12:16:55.908644 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerStarted","Data":"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36"} Oct 02 12:16:55 crc kubenswrapper[4658]: I1002 12:16:55.942490 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5z94s" podStartSLOduration=2.5002735510000003 podStartE2EDuration="4.94247192s" podCreationTimestamp="2025-10-02 12:16:51 +0000 UTC" firstStartedPulling="2025-10-02 12:16:52.873393129 +0000 UTC m=+3493.764546716" lastFinishedPulling="2025-10-02 12:16:55.315591528 +0000 UTC m=+3496.206745085" observedRunningTime="2025-10-02 12:16:55.925727439 +0000 UTC m=+3496.816881016" watchObservedRunningTime="2025-10-02 12:16:55.94247192 +0000 
UTC m=+3496.833625487" Oct 02 12:17:01 crc kubenswrapper[4658]: I1002 12:17:01.872633 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:01 crc kubenswrapper[4658]: I1002 12:17:01.873270 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:01 crc kubenswrapper[4658]: I1002 12:17:01.931314 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:02 crc kubenswrapper[4658]: I1002 12:17:02.021056 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:02 crc kubenswrapper[4658]: I1002 12:17:02.180544 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:17:03 crc kubenswrapper[4658]: I1002 12:17:03.994319 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5z94s" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="registry-server" containerID="cri-o://a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36" gracePeriod=2 Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.510648 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.682235 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities\") pod \"9bf390e2-816d-49b0-953f-afcecc09cd26\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.682346 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content\") pod \"9bf390e2-816d-49b0-953f-afcecc09cd26\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.682407 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48db\" (UniqueName: \"kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db\") pod \"9bf390e2-816d-49b0-953f-afcecc09cd26\" (UID: \"9bf390e2-816d-49b0-953f-afcecc09cd26\") " Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.683234 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities" (OuterVolumeSpecName: "utilities") pod "9bf390e2-816d-49b0-953f-afcecc09cd26" (UID: "9bf390e2-816d-49b0-953f-afcecc09cd26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.691887 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db" (OuterVolumeSpecName: "kube-api-access-v48db") pod "9bf390e2-816d-49b0-953f-afcecc09cd26" (UID: "9bf390e2-816d-49b0-953f-afcecc09cd26"). InnerVolumeSpecName "kube-api-access-v48db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.695371 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bf390e2-816d-49b0-953f-afcecc09cd26" (UID: "9bf390e2-816d-49b0-953f-afcecc09cd26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.784734 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.784781 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf390e2-816d-49b0-953f-afcecc09cd26-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:04 crc kubenswrapper[4658]: I1002 12:17:04.784801 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48db\" (UniqueName: \"kubernetes.io/projected/9bf390e2-816d-49b0-953f-afcecc09cd26-kube-api-access-v48db\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.016224 4658 generic.go:334] "Generic (PLEG): container finished" podID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerID="a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36" exitCode=0 Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.016285 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerDied","Data":"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36"} Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.016358 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z94s" event={"ID":"9bf390e2-816d-49b0-953f-afcecc09cd26","Type":"ContainerDied","Data":"54b5119641a55b7aac0bab10f6529a49d6e318d2ff6a7d40b62666ef0d881b84"} Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.016382 4658 scope.go:117] "RemoveContainer" containerID="a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.016723 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z94s" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.046119 4658 scope.go:117] "RemoveContainer" containerID="32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.079982 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.082689 4658 scope.go:117] "RemoveContainer" containerID="063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.088164 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z94s"] Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.124113 4658 scope.go:117] "RemoveContainer" containerID="a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36" Oct 02 12:17:05 crc kubenswrapper[4658]: E1002 12:17:05.124570 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36\": container with ID starting with a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36 not found: ID does not exist" containerID="a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.124598 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36"} err="failed to get container status \"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36\": rpc error: code = NotFound desc = could not find container \"a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36\": container with ID starting with a284bd98fe6a06b3afaf2dbb0e9d8fe3e46e82452c8c35209cb0c401ea1d9e36 not found: ID does not exist" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.124616 4658 scope.go:117] "RemoveContainer" containerID="32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7" Oct 02 12:17:05 crc kubenswrapper[4658]: E1002 12:17:05.125062 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7\": container with ID starting with 32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7 not found: ID does not exist" containerID="32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.125112 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7"} err="failed to get container status \"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7\": rpc error: code = NotFound desc = could not find container \"32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7\": container with ID starting with 32afff2300ed1f03c8d1b9a5029dc4c9614623f0d6ec2d2779518fc8e8a5d6e7 not found: ID does not exist" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.125146 4658 scope.go:117] "RemoveContainer" containerID="063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79" Oct 02 12:17:05 crc kubenswrapper[4658]: E1002 12:17:05.125486 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79\": container with ID starting with 063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79 not found: ID does not exist" containerID="063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.125510 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79"} err="failed to get container status \"063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79\": rpc error: code = NotFound desc = could not find container \"063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79\": container with ID starting with 063559f29d03ef5cd0dae5a6af6f5037dad7720ea732f63549c104922c819c79 not found: ID does not exist" Oct 02 12:17:05 crc kubenswrapper[4658]: I1002 12:17:05.968604 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" path="/var/lib/kubelet/pods/9bf390e2-816d-49b0-953f-afcecc09cd26/volumes" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.791418 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:17:57 crc kubenswrapper[4658]: E1002 12:17:57.792244 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="extract-utilities" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.792260 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="extract-utilities" Oct 02 12:17:57 crc kubenswrapper[4658]: E1002 12:17:57.792331 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="extract-content" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.792342 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="extract-content" Oct 02 12:17:57 crc kubenswrapper[4658]: E1002 12:17:57.792360 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="registry-server" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.792369 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="registry-server" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.792619 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf390e2-816d-49b0-953f-afcecc09cd26" containerName="registry-server" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.794026 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.805947 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.914246 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjxh\" (UniqueName: \"kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.914344 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:57 crc kubenswrapper[4658]: I1002 12:17:57.914365 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.016026 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjxh\" (UniqueName: \"kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.016433 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.016458 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.016937 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.017086 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.037412 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mwjxh\" (UniqueName: \"kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh\") pod \"redhat-operators-wgxn2\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.117586 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:17:58 crc kubenswrapper[4658]: I1002 12:17:58.623182 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:17:59 crc kubenswrapper[4658]: I1002 12:17:59.558046 4658 generic.go:334] "Generic (PLEG): container finished" podID="25a208ba-cb56-48b1-812a-63b79df24726" containerID="0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687" exitCode=0 Oct 02 12:17:59 crc kubenswrapper[4658]: I1002 12:17:59.558125 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerDied","Data":"0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687"} Oct 02 12:17:59 crc kubenswrapper[4658]: I1002 12:17:59.558403 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerStarted","Data":"714b7b7a2114dce3f81afe80867f460684180b7dae2dacbf2ba73bf8f34a7ea2"} Oct 02 12:18:01 crc kubenswrapper[4658]: I1002 12:18:01.579335 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerStarted","Data":"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6"} Oct 02 12:18:03 crc kubenswrapper[4658]: I1002 12:18:03.607349 4658 generic.go:334] "Generic (PLEG): container finished" podID="25a208ba-cb56-48b1-812a-63b79df24726" containerID="fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6" exitCode=0 Oct 02 12:18:03 crc kubenswrapper[4658]: I1002 12:18:03.607405 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerDied","Data":"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6"} Oct 02 12:18:04 crc kubenswrapper[4658]: I1002 12:18:04.619732 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerStarted","Data":"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72"} Oct 02 12:18:08 crc kubenswrapper[4658]: I1002 12:18:08.118508 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:08 crc kubenswrapper[4658]: I1002 12:18:08.118849 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:09 crc kubenswrapper[4658]: I1002 12:18:09.170366 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wgxn2" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="registry-server" probeResult="failure" output=< Oct 02 12:18:09 crc kubenswrapper[4658]: timeout: failed to connect service ":50051" within 1s Oct 02 12:18:09 crc kubenswrapper[4658]: > Oct 02 12:18:18 crc 
kubenswrapper[4658]: I1002 12:18:18.168020 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:18 crc kubenswrapper[4658]: I1002 12:18:18.189136 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgxn2" podStartSLOduration=16.362613428 podStartE2EDuration="21.189110788s" podCreationTimestamp="2025-10-02 12:17:57 +0000 UTC" firstStartedPulling="2025-10-02 12:17:59.559746656 +0000 UTC m=+3560.450900223" lastFinishedPulling="2025-10-02 12:18:04.386244016 +0000 UTC m=+3565.277397583" observedRunningTime="2025-10-02 12:18:04.647766066 +0000 UTC m=+3565.538919633" watchObservedRunningTime="2025-10-02 12:18:18.189110788 +0000 UTC m=+3579.080264355" Oct 02 12:18:18 crc kubenswrapper[4658]: I1002 12:18:18.223788 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:18 crc kubenswrapper[4658]: I1002 12:18:18.417779 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:18:19 crc kubenswrapper[4658]: I1002 12:18:19.768171 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgxn2" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="registry-server" containerID="cri-o://9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72" gracePeriod=2 Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.299502 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.403239 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities\") pod \"25a208ba-cb56-48b1-812a-63b79df24726\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.403399 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content\") pod \"25a208ba-cb56-48b1-812a-63b79df24726\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.403627 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwjxh\" (UniqueName: \"kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh\") pod \"25a208ba-cb56-48b1-812a-63b79df24726\" (UID: \"25a208ba-cb56-48b1-812a-63b79df24726\") " Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.404991 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities" (OuterVolumeSpecName: "utilities") pod "25a208ba-cb56-48b1-812a-63b79df24726" (UID: "25a208ba-cb56-48b1-812a-63b79df24726"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.409365 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh" (OuterVolumeSpecName: "kube-api-access-mwjxh") pod "25a208ba-cb56-48b1-812a-63b79df24726" (UID: "25a208ba-cb56-48b1-812a-63b79df24726"). InnerVolumeSpecName "kube-api-access-mwjxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.480612 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25a208ba-cb56-48b1-812a-63b79df24726" (UID: "25a208ba-cb56-48b1-812a-63b79df24726"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.506412 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.506451 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25a208ba-cb56-48b1-812a-63b79df24726-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.506465 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwjxh\" (UniqueName: \"kubernetes.io/projected/25a208ba-cb56-48b1-812a-63b79df24726-kube-api-access-mwjxh\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.781878 4658 generic.go:334] "Generic (PLEG): container finished" podID="25a208ba-cb56-48b1-812a-63b79df24726" containerID="9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72" exitCode=0 Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.782026 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgxn2" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.782096 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerDied","Data":"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72"} Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.782762 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgxn2" event={"ID":"25a208ba-cb56-48b1-812a-63b79df24726","Type":"ContainerDied","Data":"714b7b7a2114dce3f81afe80867f460684180b7dae2dacbf2ba73bf8f34a7ea2"} Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.782804 4658 scope.go:117] "RemoveContainer" containerID="9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.813098 4658 scope.go:117] "RemoveContainer" containerID="fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.841458 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.850605 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgxn2"] Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.854187 4658 scope.go:117] "RemoveContainer" containerID="0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.897257 4658 scope.go:117] "RemoveContainer" containerID="9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72" Oct 02 12:18:20 crc kubenswrapper[4658]: E1002 12:18:20.897872 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72\": container with ID starting with 9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72 not found: ID does not exist" containerID="9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.897928 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72"} err="failed to get container status \"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72\": rpc error: code = NotFound desc = could not find container \"9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72\": container with ID starting with 9feb47fe5a20db575b17358361d579130f5e3ab4487851991951932c300f0c72 not found: ID does not exist" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.897962 4658 scope.go:117] "RemoveContainer" containerID="fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6" Oct 02 12:18:20 crc kubenswrapper[4658]: E1002 12:18:20.898934 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6\": container with ID starting with fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6 not found: ID does not exist" containerID="fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.898978 4658 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6"} err="failed to get container status \"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6\": rpc error: code = NotFound desc = could not find container \"fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6\": container with ID starting with fa3581d013b3dfb89c7e6f9fa1c2fe18707e9eb04e00a403ba21c8dfa8f063e6 not found: ID does not exist" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.899005 4658 scope.go:117] "RemoveContainer" containerID="0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687" Oct 02 12:18:20 crc kubenswrapper[4658]: E1002 12:18:20.899395 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687\": container with ID starting with 0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687 not found: ID does not exist" containerID="0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687" Oct 02 12:18:20 crc kubenswrapper[4658]: I1002 12:18:20.899429 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687"} err="failed to get container status \"0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687\": rpc error: code = NotFound desc = could not find container \"0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687\": container with ID starting with 0b452cbba3bda6c32743f0ab9706a199c83bdcc644f02fff07a79387b9f2a687 not found: ID does not exist" Oct 02 12:18:21 crc kubenswrapper[4658]: I1002 12:18:21.962926 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a208ba-cb56-48b1-812a-63b79df24726" path="/var/lib/kubelet/pods/25a208ba-cb56-48b1-812a-63b79df24726/volumes" Oct 02 12:18:27 crc kubenswrapper[4658]: I1002 12:18:27.429888 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:18:27 crc kubenswrapper[4658]: I1002 12:18:27.430938 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:18:57 crc kubenswrapper[4658]: I1002 12:18:57.429948 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:18:57 crc kubenswrapper[4658]: I1002 12:18:57.430650 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:19:27 crc kubenswrapper[4658]: I1002 
12:19:27.430079 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:19:27 crc kubenswrapper[4658]: I1002 12:19:27.430568 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:19:27 crc kubenswrapper[4658]: I1002 12:19:27.430611 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:19:27 crc kubenswrapper[4658]: I1002 12:19:27.431423 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:19:27 crc kubenswrapper[4658]: I1002 12:19:27.431476 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" gracePeriod=600 Oct 02 12:19:27 crc kubenswrapper[4658]: E1002 12:19:27.560898 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:19:28 crc kubenswrapper[4658]: I1002 12:19:28.446213 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" exitCode=0 Oct 02 12:19:28 crc kubenswrapper[4658]: I1002 12:19:28.446418 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d"} Oct 02 12:19:28 crc kubenswrapper[4658]: I1002 12:19:28.446764 4658 scope.go:117] "RemoveContainer" containerID="b6e3d7f53f7f649211086bc471036b0b4ebd0378bc6474fc30dc0f2fa04fc98b" Oct 02 12:19:28 crc kubenswrapper[4658]: I1002 12:19:28.447489 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:19:28 crc kubenswrapper[4658]: E1002 12:19:28.447816 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:19:43 crc kubenswrapper[4658]: I1002 12:19:43.948943 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:19:43 crc kubenswrapper[4658]: E1002 12:19:43.949854 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:19:54 crc kubenswrapper[4658]: I1002 12:19:54.949340 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:19:54 crc kubenswrapper[4658]: E1002 12:19:54.950452 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:20:08 crc kubenswrapper[4658]: I1002 12:20:08.948935 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:20:08 crc kubenswrapper[4658]: E1002 12:20:08.949824 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:20:19 crc kubenswrapper[4658]: I1002 12:20:19.961680 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:20:19 crc kubenswrapper[4658]: E1002 12:20:19.962570 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:20:30 crc kubenswrapper[4658]: I1002 12:20:30.948941 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:20:30 crc kubenswrapper[4658]: E1002 12:20:30.949665 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:20:44 crc kubenswrapper[4658]: I1002 12:20:44.950202 4658 
scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:20:44 crc kubenswrapper[4658]: E1002 12:20:44.951111 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:20:56 crc kubenswrapper[4658]: I1002 12:20:56.949203 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:20:56 crc kubenswrapper[4658]: E1002 12:20:56.950138 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:21:10 crc kubenswrapper[4658]: I1002 12:21:10.951053 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:21:10 crc kubenswrapper[4658]: E1002 12:21:10.952527 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:21:22 crc kubenswrapper[4658]: I1002 12:21:22.950411 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:21:22 crc kubenswrapper[4658]: E1002 12:21:22.951190 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.949916 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:21:35 crc kubenswrapper[4658]: E1002 12:21:35.950623 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.986647 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:35 crc kubenswrapper[4658]: E1002 12:21:35.987152 4658 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="registry-server" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.987174 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="registry-server" Oct 02 12:21:35 crc kubenswrapper[4658]: E1002 12:21:35.987217 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="extract-content" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.987227 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="extract-content" Oct 02 12:21:35 crc kubenswrapper[4658]: E1002 12:21:35.987245 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="extract-utilities" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.987253 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="extract-utilities" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.987522 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a208ba-cb56-48b1-812a-63b79df24726" containerName="registry-server" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.990432 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:35 crc kubenswrapper[4658]: I1002 12:21:35.996718 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.023445 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hws\" (UniqueName: \"kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.024542 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.024628 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.125740 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hws\" (UniqueName: \"kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.125914 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.125947 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.126567 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.126582 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.152123 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hws\" (UniqueName: \"kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws\") pod \"community-operators-sdftx\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.334826 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:36 crc kubenswrapper[4658]: I1002 12:21:36.889743 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:36 crc kubenswrapper[4658]: W1002 12:21:36.896757 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17289561_86e4_481c_a046_6bcd38124f5f.slice/crio-ed80d71cca2b4f596879ddfa3f8a00fb782eabce108cf0b9a5d4c7b6dea04fe9 WatchSource:0}: Error finding container ed80d71cca2b4f596879ddfa3f8a00fb782eabce108cf0b9a5d4c7b6dea04fe9: Status 404 returned error can't find the container with id ed80d71cca2b4f596879ddfa3f8a00fb782eabce108cf0b9a5d4c7b6dea04fe9 Oct 02 12:21:37 crc kubenswrapper[4658]: I1002 12:21:37.723607 4658 generic.go:334] "Generic (PLEG): container finished" podID="17289561-86e4-481c-a046-6bcd38124f5f" containerID="745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24" exitCode=0 Oct 02 12:21:37 crc kubenswrapper[4658]: I1002 12:21:37.723882 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerDied","Data":"745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24"} Oct 02 12:21:37 crc kubenswrapper[4658]: I1002 12:21:37.723930 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerStarted","Data":"ed80d71cca2b4f596879ddfa3f8a00fb782eabce108cf0b9a5d4c7b6dea04fe9"} Oct 02 12:21:37 crc kubenswrapper[4658]: I1002 12:21:37.726513 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:21:38 crc kubenswrapper[4658]: I1002 12:21:38.734405 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerStarted","Data":"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2"} Oct 02 12:21:39 crc kubenswrapper[4658]: I1002 12:21:39.750225 4658 generic.go:334] "Generic (PLEG): container finished" podID="17289561-86e4-481c-a046-6bcd38124f5f" containerID="75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2" exitCode=0 Oct 02 12:21:39 crc kubenswrapper[4658]: I1002 12:21:39.750348 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerDied","Data":"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2"} Oct 02 12:21:40 crc kubenswrapper[4658]: I1002 12:21:40.764450 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerStarted","Data":"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216"} Oct 02 12:21:40 crc kubenswrapper[4658]: I1002 12:21:40.783917 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdftx" podStartSLOduration=3.384178885 podStartE2EDuration="5.783897904s" podCreationTimestamp="2025-10-02 12:21:35 +0000 UTC" firstStartedPulling="2025-10-02 12:21:37.726230881 +0000 UTC m=+3778.617384448" lastFinishedPulling="2025-10-02 12:21:40.1259499 +0000 UTC m=+3781.017103467" 
observedRunningTime="2025-10-02 12:21:40.781668493 +0000 UTC m=+3781.672822060" watchObservedRunningTime="2025-10-02 12:21:40.783897904 +0000 UTC m=+3781.675051471" Oct 02 12:21:46 crc kubenswrapper[4658]: I1002 12:21:46.335928 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:46 crc kubenswrapper[4658]: I1002 12:21:46.336478 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:46 crc kubenswrapper[4658]: I1002 12:21:46.411011 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:46 crc kubenswrapper[4658]: I1002 12:21:46.875394 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:46 crc kubenswrapper[4658]: I1002 12:21:46.928675 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:48 crc kubenswrapper[4658]: I1002 12:21:48.844860 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdftx" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="registry-server" containerID="cri-o://b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216" gracePeriod=2 Oct 02 12:21:48 crc kubenswrapper[4658]: I1002 12:21:48.949496 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:21:48 crc kubenswrapper[4658]: E1002 12:21:48.949830 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.320910 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.491556 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content\") pod \"17289561-86e4-481c-a046-6bcd38124f5f\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.491724 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5hws\" (UniqueName: \"kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws\") pod \"17289561-86e4-481c-a046-6bcd38124f5f\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.492314 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities\") pod \"17289561-86e4-481c-a046-6bcd38124f5f\" (UID: \"17289561-86e4-481c-a046-6bcd38124f5f\") " Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.493322 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities" (OuterVolumeSpecName: "utilities") pod "17289561-86e4-481c-a046-6bcd38124f5f" (UID: "17289561-86e4-481c-a046-6bcd38124f5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.499867 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws" (OuterVolumeSpecName: "kube-api-access-p5hws") pod "17289561-86e4-481c-a046-6bcd38124f5f" (UID: "17289561-86e4-481c-a046-6bcd38124f5f"). InnerVolumeSpecName "kube-api-access-p5hws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.545511 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17289561-86e4-481c-a046-6bcd38124f5f" (UID: "17289561-86e4-481c-a046-6bcd38124f5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.594804 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.594841 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17289561-86e4-481c-a046-6bcd38124f5f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.594853 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5hws\" (UniqueName: \"kubernetes.io/projected/17289561-86e4-481c-a046-6bcd38124f5f-kube-api-access-p5hws\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.858433 4658 generic.go:334] "Generic (PLEG): container finished" podID="17289561-86e4-481c-a046-6bcd38124f5f" containerID="b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216" exitCode=0 Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.858545 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerDied","Data":"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216"} Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.858590 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdftx" event={"ID":"17289561-86e4-481c-a046-6bcd38124f5f","Type":"ContainerDied","Data":"ed80d71cca2b4f596879ddfa3f8a00fb782eabce108cf0b9a5d4c7b6dea04fe9"} Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.858609 4658 scope.go:117] "RemoveContainer" containerID="b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.858791 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdftx" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.896057 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.904413 4658 scope.go:117] "RemoveContainer" containerID="75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.906929 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdftx"] Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.927108 4658 scope.go:117] "RemoveContainer" containerID="745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.971468 4658 scope.go:117] "RemoveContainer" containerID="b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216" Oct 02 12:21:49 crc kubenswrapper[4658]: E1002 12:21:49.971873 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216\": container with ID starting with b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216 not found: ID does not exist" containerID="b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.971918 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216"} err="failed to get container status \"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216\": rpc error: code = NotFound desc = could not find container \"b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216\": container with ID starting with b6d28d1e1482e6efc5255121600c9522ac74f1558a1a59dc74d61f4a7051e216 not found: ID does not exist" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.971941 4658 scope.go:117] "RemoveContainer" containerID="75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2" Oct 02 12:21:49 crc kubenswrapper[4658]: E1002 12:21:49.972199 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2\": container with ID starting with 75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2 not found: ID does not exist" containerID="75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.972233 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2"} err="failed to get container status \"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2\": rpc error: code = NotFound desc = could not find container \"75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2\": container with ID starting with 75a53621386f65bb5629ad4846f4de285d9314b87784f815824338a4401133c2 not found: ID does not exist" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.972253 4658 scope.go:117] "RemoveContainer" containerID="745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24" Oct 02 12:21:49 crc kubenswrapper[4658]: E1002 12:21:49.972481 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24\": container with ID starting with 745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24 not found: ID does not exist" containerID="745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.972506 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24"} err="failed to get container status \"745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24\": rpc error: code = NotFound desc = could not find container \"745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24\": container with ID starting with 745ba6c234fc9c157d807085f0848babd4e133ac8bc8a9eb4e66c0f2a320ab24 not found: ID does not exist" Oct 02 12:21:49 crc kubenswrapper[4658]: I1002 12:21:49.977660 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17289561-86e4-481c-a046-6bcd38124f5f" path="/var/lib/kubelet/pods/17289561-86e4-481c-a046-6bcd38124f5f/volumes" Oct 02 12:22:02 crc kubenswrapper[4658]: I1002 12:22:02.949390 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:22:02 crc kubenswrapper[4658]: E1002 12:22:02.950542 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:22:16 crc kubenswrapper[4658]: I1002 12:22:16.949431 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:22:16 crc kubenswrapper[4658]: E1002 12:22:16.950272 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:22:30 crc kubenswrapper[4658]: I1002 12:22:30.949702 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:22:30 crc kubenswrapper[4658]: E1002 12:22:30.950560 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:22:42 crc kubenswrapper[4658]: I1002 12:22:42.949467 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:22:42 crc kubenswrapper[4658]: E1002 12:22:42.950346 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:22:54 crc kubenswrapper[4658]: I1002 12:22:54.949103 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:22:54 crc kubenswrapper[4658]: E1002 12:22:54.950052 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:23:09 crc kubenswrapper[4658]: I1002 12:23:09.956996 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:23:09 crc kubenswrapper[4658]: E1002 12:23:09.957807 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:23:24 crc kubenswrapper[4658]: I1002 12:23:24.949723 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:23:24 crc kubenswrapper[4658]: E1002 12:23:24.950810 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:23:31 crc kubenswrapper[4658]: I1002 12:23:31.673054 4658 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5566488b4c-k88mg" podUID="67435e65-47df-41df-9570-df74c35bd5fc" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 02 12:23:37 crc kubenswrapper[4658]: I1002 12:23:37.951343 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:23:37 crc kubenswrapper[4658]: E1002 12:23:37.952529 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:23:50 crc kubenswrapper[4658]: I1002 12:23:50.949802 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:23:50 crc kubenswrapper[4658]: 
E1002 12:23:50.950550 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:24:03 crc kubenswrapper[4658]: I1002 12:24:03.950651 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:24:03 crc kubenswrapper[4658]: E1002 12:24:03.951454 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:24:15 crc kubenswrapper[4658]: I1002 12:24:15.949125 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:24:15 crc kubenswrapper[4658]: E1002 12:24:15.949952 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:24:26 crc kubenswrapper[4658]: I1002 12:24:26.949374 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:24:26 crc kubenswrapper[4658]: E1002 12:24:26.950229 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:24:41 crc kubenswrapper[4658]: I1002 12:24:41.950070 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d" Oct 02 12:24:42 crc kubenswrapper[4658]: I1002 12:24:42.557063 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e"} Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.030849 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mm7ql"] Oct 02 12:25:44 crc kubenswrapper[4658]: E1002 12:25:44.031957 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="registry-server" Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.031972 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="registry-server" 
Oct 02 12:25:44 crc kubenswrapper[4658]: E1002 12:25:44.031990 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="extract-utilities"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.031998 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="extract-utilities"
Oct 02 12:25:44 crc kubenswrapper[4658]: E1002 12:25:44.032013 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="extract-content"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.032019 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="extract-content"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.032228 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="17289561-86e4-481c-a046-6bcd38124f5f" containerName="registry-server"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.033667 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.040969 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mm7ql"]
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.205322 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-catalog-content\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.205679 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-utilities\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.206699 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fbtd\" (UniqueName: \"kubernetes.io/projected/d41a29d7-3972-41e5-9ab4-fd44f44bc184-kube-api-access-7fbtd\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.308223 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-catalog-content\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.308689 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-catalog-content\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.308703 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-utilities\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.309170 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fbtd\" (UniqueName: \"kubernetes.io/projected/d41a29d7-3972-41e5-9ab4-fd44f44bc184-kube-api-access-7fbtd\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.309285 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41a29d7-3972-41e5-9ab4-fd44f44bc184-utilities\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.332889 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fbtd\" (UniqueName: \"kubernetes.io/projected/d41a29d7-3972-41e5-9ab4-fd44f44bc184-kube-api-access-7fbtd\") pod \"certified-operators-mm7ql\" (UID: \"d41a29d7-3972-41e5-9ab4-fd44f44bc184\") " pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.357534 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:44 crc kubenswrapper[4658]: I1002 12:25:44.861366 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mm7ql"]
Oct 02 12:25:45 crc kubenswrapper[4658]: I1002 12:25:45.129315 4658 generic.go:334] "Generic (PLEG): container finished" podID="d41a29d7-3972-41e5-9ab4-fd44f44bc184" containerID="df69d2509b0def30e4bb06c439795b58e656e58849bcd4cc689a974dc77aead0" exitCode=0
Oct 02 12:25:45 crc kubenswrapper[4658]: I1002 12:25:45.129493 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm7ql" event={"ID":"d41a29d7-3972-41e5-9ab4-fd44f44bc184","Type":"ContainerDied","Data":"df69d2509b0def30e4bb06c439795b58e656e58849bcd4cc689a974dc77aead0"}
Oct 02 12:25:45 crc kubenswrapper[4658]: I1002 12:25:45.129671 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm7ql" event={"ID":"d41a29d7-3972-41e5-9ab4-fd44f44bc184","Type":"ContainerStarted","Data":"dd9d1bbadf47eda2bab2000d8ac0c1e887dbfd01ab23a7542af4bda10b2db024"}
Oct 02 12:25:51 crc kubenswrapper[4658]: I1002 12:25:51.196012 4658 generic.go:334] "Generic (PLEG): container finished" podID="d41a29d7-3972-41e5-9ab4-fd44f44bc184" containerID="97a4850107df7d9ddea9e250f01695370c25c3b369e16d52a8ee009f6d994c51" exitCode=0
Oct 02 12:25:51 crc kubenswrapper[4658]: I1002 12:25:51.196089 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm7ql" event={"ID":"d41a29d7-3972-41e5-9ab4-fd44f44bc184","Type":"ContainerDied","Data":"97a4850107df7d9ddea9e250f01695370c25c3b369e16d52a8ee009f6d994c51"}
Oct 02 12:25:52 crc kubenswrapper[4658]: I1002 12:25:52.210800 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm7ql" event={"ID":"d41a29d7-3972-41e5-9ab4-fd44f44bc184","Type":"ContainerStarted","Data":"8eb4d96e4d0f45022777549330876b0d9e7828d486117354fbf7a144107e9952"}
Oct 02 12:25:52 crc kubenswrapper[4658]: I1002 12:25:52.238574 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mm7ql" podStartSLOduration=1.707583453 podStartE2EDuration="8.238553071s" podCreationTimestamp="2025-10-02 12:25:44 +0000 UTC" firstStartedPulling="2025-10-02 12:25:45.131498428 +0000 UTC m=+4026.022652005" lastFinishedPulling="2025-10-02 12:25:51.662468056 +0000 UTC m=+4032.553621623" observedRunningTime="2025-10-02 12:25:52.233081505 +0000 UTC m=+4033.124235102" watchObservedRunningTime="2025-10-02 12:25:52.238553071 +0000 UTC m=+4033.129706638"
Oct 02 12:25:54 crc kubenswrapper[4658]: I1002 12:25:54.358740 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:54 crc kubenswrapper[4658]: I1002 12:25:54.359061 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:25:54 crc kubenswrapper[4658]: I1002 12:25:54.400868 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:26:04 crc kubenswrapper[4658]: I1002 12:26:04.419049 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mm7ql"
Oct 02 12:26:07 crc kubenswrapper[4658]: I1002 12:26:07.176079 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mm7ql"]
Oct 02 12:26:07 crc kubenswrapper[4658]: I1002 12:26:07.346882 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"]
Oct 02 12:26:07 crc kubenswrapper[4658]: I1002 12:26:07.347121 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2bgnl" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="registry-server" containerID="cri-o://30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b" gracePeriod=2
Oct 02 12:26:07 crc kubenswrapper[4658]: I1002 12:26:07.914844 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bgnl"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.020902 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content\") pod \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") "
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.021057 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities\") pod \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") "
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.021471 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2sj4\" (UniqueName: \"kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4\") pod \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\" (UID: \"6c4f50a2-0ec0-44db-9817-8b3116a2415b\") "
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.023063 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities" (OuterVolumeSpecName: "utilities") pod "6c4f50a2-0ec0-44db-9817-8b3116a2415b" (UID: "6c4f50a2-0ec0-44db-9817-8b3116a2415b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.024328 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.029767 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4" (OuterVolumeSpecName: "kube-api-access-h2sj4") pod "6c4f50a2-0ec0-44db-9817-8b3116a2415b" (UID: "6c4f50a2-0ec0-44db-9817-8b3116a2415b"). InnerVolumeSpecName "kube-api-access-h2sj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.109081 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c4f50a2-0ec0-44db-9817-8b3116a2415b" (UID: "6c4f50a2-0ec0-44db-9817-8b3116a2415b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.125652 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2sj4\" (UniqueName: \"kubernetes.io/projected/6c4f50a2-0ec0-44db-9817-8b3116a2415b-kube-api-access-h2sj4\") on node \"crc\" DevicePath \"\""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.125690 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4f50a2-0ec0-44db-9817-8b3116a2415b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.371283 4658 generic.go:334] "Generic (PLEG): container finished" podID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerID="30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b" exitCode=0
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.371425 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerDied","Data":"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"}
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.371461 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgnl" event={"ID":"6c4f50a2-0ec0-44db-9817-8b3116a2415b","Type":"ContainerDied","Data":"a6a1449a3faf9a4547b55d926e26c947e9845f849e7ec2a6a50d0113c6bfe7ae"}
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.371482 4658 scope.go:117] "RemoveContainer" containerID="30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.371662 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bgnl"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.412142 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"]
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.419966 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2bgnl"]
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.422151 4658 scope.go:117] "RemoveContainer" containerID="ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.451848 4658 scope.go:117] "RemoveContainer" containerID="4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.504557 4658 scope.go:117] "RemoveContainer" containerID="30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"
Oct 02 12:26:08 crc kubenswrapper[4658]: E1002 12:26:08.505190 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b\": container with ID starting with 30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b not found: ID does not exist" containerID="30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.505438 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b"} err="failed to get container status \"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b\": rpc error: code = NotFound desc = could not find container \"30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b\": container with ID starting with 30cf6dcf8aee083b63880209b48753059e40cb952927c7c970411841986fd90b not found: ID does not exist"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.505583 4658 scope.go:117] "RemoveContainer" containerID="ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690"
Oct 02 12:26:08 crc kubenswrapper[4658]: E1002 12:26:08.506305 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690\": container with ID starting with ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690 not found: ID does not exist" containerID="ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.506421 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690"} err="failed to get container status \"ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690\": rpc error: code = NotFound desc = could not find container \"ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690\": container with ID starting with ca1673f9d097ac2020067fa329d947eec7eff0af3d522ef10b2e552c64a04690 not found: ID does not exist"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.506525 4658 scope.go:117] "RemoveContainer" containerID="4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f"
Oct 02 12:26:08 crc kubenswrapper[4658]: E1002 12:26:08.507106 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f\": container with ID starting with 4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f not found: ID does not exist" containerID="4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f"
Oct 02 12:26:08 crc kubenswrapper[4658]: I1002 12:26:08.507150 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f"} err="failed to get container status \"4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f\": rpc error: code = NotFound desc = could not find container \"4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f\": container with ID starting with 4aa02d3263fe8a6cf1de418e4d1e6b4531de2aa07f071c953f40b11d05c13c7f not found: ID does not exist"
Oct 02 12:26:09 crc kubenswrapper[4658]: I1002 12:26:09.962534 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" path="/var/lib/kubelet/pods/6c4f50a2-0ec0-44db-9817-8b3116a2415b/volumes"
Oct 02 12:26:57 crc kubenswrapper[4658]: I1002 12:26:57.430448 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:26:57 crc kubenswrapper[4658]: I1002 12:26:57.430983 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:27:27 crc kubenswrapper[4658]: I1002 12:27:27.429844 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:27:27 crc kubenswrapper[4658]: I1002 12:27:27.430265 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.211088 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:28 crc kubenswrapper[4658]: E1002 12:27:28.212826 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="extract-utilities"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.212913 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="extract-utilities"
Oct 02 12:27:28 crc kubenswrapper[4658]: E1002 12:27:28.212991 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="extract-content"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.213065 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="extract-content"
Oct 02 12:27:28 crc kubenswrapper[4658]: E1002 12:27:28.213138 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="registry-server"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.213189 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="registry-server"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.213505 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4f50a2-0ec0-44db-9817-8b3116a2415b" containerName="registry-server"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.215428 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.238540 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.407728 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.407824 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw942\" (UniqueName: \"kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.407883 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.509854 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.509934 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw942\" (UniqueName: \"kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.509979 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.510498 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.510756 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.531690 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw942\" (UniqueName: \"kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942\") pod \"redhat-marketplace-gzn65\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") " pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:28 crc kubenswrapper[4658]: I1002 12:27:28.550446 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:29 crc kubenswrapper[4658]: W1002 12:27:29.103801 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7dabaab_f720_4ebc_a3be_3f32d2414cf9.slice/crio-27eb42647a461497ea39c4d7add93e8b87435a87bde7b99dde3dfd129dcf47e7 WatchSource:0}: Error finding container 27eb42647a461497ea39c4d7add93e8b87435a87bde7b99dde3dfd129dcf47e7: Status 404 returned error can't find the container with id 27eb42647a461497ea39c4d7add93e8b87435a87bde7b99dde3dfd129dcf47e7
Oct 02 12:27:29 crc kubenswrapper[4658]: I1002 12:27:29.104119 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:29 crc kubenswrapper[4658]: I1002 12:27:29.129686 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerStarted","Data":"27eb42647a461497ea39c4d7add93e8b87435a87bde7b99dde3dfd129dcf47e7"}
Oct 02 12:27:30 crc kubenswrapper[4658]: I1002 12:27:30.141354 4658 generic.go:334] "Generic (PLEG): container finished" podID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerID="fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650" exitCode=0
Oct 02 12:27:30 crc kubenswrapper[4658]: I1002 12:27:30.141626 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerDied","Data":"fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650"}
Oct 02 12:27:30 crc kubenswrapper[4658]: I1002 12:27:30.144398 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 12:27:31 crc kubenswrapper[4658]: I1002 12:27:31.155990 4658 generic.go:334] "Generic (PLEG): container finished" podID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerID="906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8" exitCode=0
Oct 02 12:27:31 crc kubenswrapper[4658]: I1002 12:27:31.156100 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerDied","Data":"906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8"}
Oct 02 12:27:32 crc kubenswrapper[4658]: I1002 12:27:32.166304 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerStarted","Data":"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"}
Oct 02 12:27:32 crc kubenswrapper[4658]: I1002 12:27:32.185049 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzn65" podStartSLOduration=2.775486858 podStartE2EDuration="4.185028255s" podCreationTimestamp="2025-10-02 12:27:28 +0000 UTC" firstStartedPulling="2025-10-02 12:27:30.144088404 +0000 UTC m=+4131.035241971" lastFinishedPulling="2025-10-02 12:27:31.553629801 +0000 UTC m=+4132.444783368" observedRunningTime="2025-10-02 12:27:32.180511779 +0000 UTC m=+4133.071665396" watchObservedRunningTime="2025-10-02 12:27:32.185028255 +0000 UTC m=+4133.076181842"
Oct 02 12:27:38 crc kubenswrapper[4658]: I1002 12:27:38.551364 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:38 crc kubenswrapper[4658]: I1002 12:27:38.551899 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:38 crc kubenswrapper[4658]: I1002 12:27:38.603938 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:39 crc kubenswrapper[4658]: I1002 12:27:39.276896 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:39 crc kubenswrapper[4658]: I1002 12:27:39.323929 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.249648 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzn65" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="registry-server" containerID="cri-o://12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be" gracePeriod=2
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.771645 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.875837 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw942\" (UniqueName: \"kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942\") pod \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") "
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.876415 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities\") pod \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") "
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.876444 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content\") pod \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\" (UID: \"c7dabaab-f720-4ebc-a3be-3f32d2414cf9\") "
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.877415 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities" (OuterVolumeSpecName: "utilities") pod "c7dabaab-f720-4ebc-a3be-3f32d2414cf9" (UID: "c7dabaab-f720-4ebc-a3be-3f32d2414cf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.882974 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942" (OuterVolumeSpecName: "kube-api-access-mw942") pod "c7dabaab-f720-4ebc-a3be-3f32d2414cf9" (UID: "c7dabaab-f720-4ebc-a3be-3f32d2414cf9"). InnerVolumeSpecName "kube-api-access-mw942". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.888457 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7dabaab-f720-4ebc-a3be-3f32d2414cf9" (UID: "c7dabaab-f720-4ebc-a3be-3f32d2414cf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.979051 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.979075 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:27:41 crc kubenswrapper[4658]: I1002 12:27:41.979086 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw942\" (UniqueName: \"kubernetes.io/projected/c7dabaab-f720-4ebc-a3be-3f32d2414cf9-kube-api-access-mw942\") on node \"crc\" DevicePath \"\""
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.262478 4658 generic.go:334] "Generic (PLEG): container finished" podID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerID="12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be" exitCode=0
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.262532 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerDied","Data":"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"}
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.262563 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzn65" event={"ID":"c7dabaab-f720-4ebc-a3be-3f32d2414cf9","Type":"ContainerDied","Data":"27eb42647a461497ea39c4d7add93e8b87435a87bde7b99dde3dfd129dcf47e7"}
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.262559 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzn65"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.262579 4658 scope.go:117] "RemoveContainer" containerID="12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.293043 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.295522 4658 scope.go:117] "RemoveContainer" containerID="906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.302029 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzn65"]
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.319404 4658 scope.go:117] "RemoveContainer" containerID="fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.369686 4658 scope.go:117] "RemoveContainer" containerID="12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"
Oct 02 12:27:42 crc kubenswrapper[4658]: E1002 12:27:42.370388 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be\": container with ID starting with 12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be not found: ID does not exist" containerID="12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.370419 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be"} err="failed to get container status \"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be\": rpc error: code = NotFound desc = could not find container \"12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be\": container with ID starting with 12512d14d289abe51ceeb363a44dc2c4070d29588f72e479d51ba7981035b3be not found: ID does not exist"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.370442 4658 scope.go:117] "RemoveContainer" containerID="906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8"
Oct 02 12:27:42 crc kubenswrapper[4658]: E1002 12:27:42.370763 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8\": container with ID starting with 906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8 not found: ID does not exist" containerID="906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.370800 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8"} err="failed to get container status \"906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8\": rpc error: code = NotFound desc = could not find container \"906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8\": container with ID starting with 906f6c914e3f4b12a4db06701bbc3c95e9beb64757a01c0fb025a32b93b05fd8 not found: ID does not exist"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.370815 4658 scope.go:117] "RemoveContainer" containerID="fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650"
Oct 02 12:27:42 crc kubenswrapper[4658]: E1002 12:27:42.371140 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650\": container with ID starting with fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650 not found: ID does not exist" containerID="fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650"
Oct 02 12:27:42 crc kubenswrapper[4658]: I1002 12:27:42.371168 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650"} err="failed to get container status \"fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650\": rpc error: code = NotFound desc = could not find container \"fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650\": container with ID starting with fbcf34710fcf4b27805fb0236b80de61c8f9368314fb68af462524e39714f650 not found: ID does not exist"
Oct 02 12:27:43 crc kubenswrapper[4658]: I1002 12:27:43.961049 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" path="/var/lib/kubelet/pods/c7dabaab-f720-4ebc-a3be-3f32d2414cf9/volumes"
Oct 02 12:27:57 crc kubenswrapper[4658]: I1002 12:27:57.429969 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:27:57 crc kubenswrapper[4658]: I1002 12:27:57.433532 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:27:57 crc kubenswrapper[4658]: I1002 12:27:57.433600 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5"
Oct 02 12:27:57 crc kubenswrapper[4658]: I1002 12:27:57.434558 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:27:57 crc kubenswrapper[4658]: I1002 12:27:57.434623 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e" gracePeriod=600
Oct 02 12:27:58 crc kubenswrapper[4658]: I1002 12:27:58.467026 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e" exitCode=0
Oct 02 12:27:58 crc kubenswrapper[4658]: I1002 12:27:58.467063 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e"}
Oct 02 12:27:58 crc kubenswrapper[4658]: I1002 12:27:58.467705 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b"}
Oct 02 12:27:58 crc kubenswrapper[4658]: I1002 12:27:58.467728 4658 scope.go:117] "RemoveContainer" containerID="0458c10e3406418fa5d2532a0b0f42fb39eefbee3faec1626562ce5a0795b50d"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.106036 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ww887"]
Oct 02 12:29:14 crc kubenswrapper[4658]: E1002 12:29:14.107052 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="registry-server"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.107069 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="registry-server"
Oct 02 12:29:14 crc kubenswrapper[4658]: E1002 12:29:14.107103 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="extract-content"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.107109 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="extract-content"
Oct 02 12:29:14 crc kubenswrapper[4658]: E1002 12:29:14.107137 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="extract-utilities"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.107145 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="extract-utilities"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.107387 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dabaab-f720-4ebc-a3be-3f32d2414cf9" containerName="registry-server"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.108789 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.117049 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ww887"]
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.190303 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.190471 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.190744 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcx6s\" (UniqueName: \"kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.292884 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.293023 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.293189 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcx6s\" (UniqueName: \"kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.293489 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.293554 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.315008 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcx6s\" (UniqueName: \"kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s\") pod \"redhat-operators-ww887\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") " pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:14 crc kubenswrapper[4658]: I1002 12:29:14.434403 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:15 crc kubenswrapper[4658]: I1002 12:29:15.010759 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ww887"]
Oct 02 12:29:15 crc kubenswrapper[4658]: I1002 12:29:15.215734 4658 generic.go:334] "Generic (PLEG): container finished" podID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerID="c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad" exitCode=0
Oct 02 12:29:15 crc kubenswrapper[4658]: I1002 12:29:15.215989 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerDied","Data":"c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad"}
Oct 02 12:29:15 crc kubenswrapper[4658]: I1002 12:29:15.216019 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerStarted","Data":"07721c69d46557b24db5685c17375091aed6ee84abed843a459a22f6507f26ff"}
Oct 02 12:29:17 crc kubenswrapper[4658]: I1002 12:29:17.253698 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerStarted","Data":"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a"}
Oct 02 12:29:18 crc kubenswrapper[4658]: I1002 12:29:18.266398 4658 generic.go:334] "Generic (PLEG): container finished" podID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerID="bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a" exitCode=0
Oct 02 12:29:18 crc kubenswrapper[4658]: I1002 12:29:18.266473 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerDied","Data":"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a"}
Oct 02 12:29:19 crc kubenswrapper[4658]: I1002 12:29:19.278352 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerStarted","Data":"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a"}
Oct 02 12:29:19 crc kubenswrapper[4658]: I1002 12:29:19.298671 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ww887" podStartSLOduration=1.8482322230000001 podStartE2EDuration="5.298646719s" podCreationTimestamp="2025-10-02 12:29:14 +0000 UTC" firstStartedPulling="2025-10-02 12:29:15.217664276 +0000 UTC m=+4236.108817843" lastFinishedPulling="2025-10-02 12:29:18.668078782 +0000 UTC m=+4239.559232339" observedRunningTime="2025-10-02 12:29:19.2939839 +0000 UTC m=+4240.185137477" watchObservedRunningTime="2025-10-02 12:29:19.298646719 +0000 UTC m=+4240.189800286"
Oct 02 12:29:24 crc kubenswrapper[4658]: I1002 12:29:24.435347 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:24 crc kubenswrapper[4658]: I1002 12:29:24.435934 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:24 crc kubenswrapper[4658]: I1002 12:29:24.481938 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:25 crc kubenswrapper[4658]: I1002 12:29:25.376243 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:25 crc kubenswrapper[4658]: I1002 12:29:25.420259 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ww887"]
Oct 02 12:29:27 crc kubenswrapper[4658]: I1002 12:29:27.348103 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ww887" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="registry-server" containerID="cri-o://e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a" gracePeriod=2
Oct 02 12:29:27 crc kubenswrapper[4658]: I1002 12:29:27.915341 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ww887"
Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.067380 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content\") pod \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") "
Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.067510 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcx6s\" (UniqueName: \"kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s\") pod \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") "
Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.067714 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities\") pod \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\" (UID: \"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da\") "
Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.068513 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities" (OuterVolumeSpecName: "utilities") pod "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" (UID: "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.074220 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s" (OuterVolumeSpecName: "kube-api-access-dcx6s") pod "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" (UID: "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da"). InnerVolumeSpecName "kube-api-access-dcx6s".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.150271 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" (UID: "6bac3fa2-80a8-4c33-995f-ecb3f63ff1da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.169833 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.169871 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.169887 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcx6s\" (UniqueName: \"kubernetes.io/projected/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da-kube-api-access-dcx6s\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.371893 4658 generic.go:334] "Generic (PLEG): container finished" podID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerID="e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a" exitCode=0 Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.372142 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerDied","Data":"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a"} Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.372256 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ww887" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.372356 4658 scope.go:117] "RemoveContainer" containerID="e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.372335 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ww887" event={"ID":"6bac3fa2-80a8-4c33-995f-ecb3f63ff1da","Type":"ContainerDied","Data":"07721c69d46557b24db5685c17375091aed6ee84abed843a459a22f6507f26ff"} Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.398467 4658 scope.go:117] "RemoveContainer" containerID="bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.407382 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ww887"] Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.421686 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ww887"] Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.426129 4658 scope.go:117] "RemoveContainer" containerID="c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.467761 4658 scope.go:117] "RemoveContainer" containerID="e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a" Oct 02 12:29:28 crc kubenswrapper[4658]: E1002 12:29:28.468165 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a\": container with ID starting with e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a not found: ID does not exist" containerID="e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.468200 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a"} err="failed to get container status \"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a\": rpc error: code = NotFound desc = could not find container \"e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a\": container with ID starting with e72ab5dce4845762a8fe839b2264db61db3e417d2027450f345b849a6446c75a not found: ID does not exist" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.468238 4658 scope.go:117] "RemoveContainer" containerID="bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a" Oct 02 12:29:28 crc kubenswrapper[4658]: E1002 12:29:28.468677 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a\": container with ID starting with bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a not found: ID does not exist" containerID="bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.468707 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a"} err="failed to get container status \"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a\": rpc error: code = NotFound desc = could not find container 
\"bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a\": container with ID starting with bae140fefc837203dd2799f4d2daab7a1b5ee5433ed036e564f80d44a67bf27a not found: ID does not exist" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.468725 4658 scope.go:117] "RemoveContainer" containerID="c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad" Oct 02 12:29:28 crc kubenswrapper[4658]: E1002 12:29:28.469274 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad\": container with ID starting with c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad not found: ID does not exist" containerID="c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad" Oct 02 12:29:28 crc kubenswrapper[4658]: I1002 12:29:28.469321 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad"} err="failed to get container status \"c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad\": rpc error: code = NotFound desc = could not find container \"c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad\": container with ID starting with c3c78267acdbd4efcb53863889cb61ad75b3fbf1e071cbd77cd78f06e9d748ad not found: ID does not exist" Oct 02 12:29:29 crc kubenswrapper[4658]: I1002 12:29:29.960701 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" path="/var/lib/kubelet/pods/6bac3fa2-80a8-4c33-995f-ecb3f63ff1da/volumes" Oct 02 12:29:57 crc kubenswrapper[4658]: I1002 12:29:57.430227 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:29:57 crc kubenswrapper[4658]: I1002 12:29:57.430802 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.144794 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m"] Oct 02 12:30:00 crc kubenswrapper[4658]: E1002 12:30:00.145418 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="extract-utilities" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.145430 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="extract-utilities" Oct 02 12:30:00 crc kubenswrapper[4658]: E1002 12:30:00.145446 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.145461 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4658]: E1002 12:30:00.145489 4658 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="extract-content" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.145495 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="extract-content" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.145820 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bac3fa2-80a8-4c33-995f-ecb3f63ff1da" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.146501 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.149141 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.149507 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.157581 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m"] Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.303464 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.303749 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vpm\" (UniqueName: \"kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.303868 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.405717 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.405788 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vpm\" (UniqueName: \"kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.405922 4658 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.407319 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.412024 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.425506 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vpm\" (UniqueName: \"kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm\") pod \"collect-profiles-29323470-nfj5m\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.470244 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:00 crc kubenswrapper[4658]: I1002 12:30:00.948831 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m"] Oct 02 12:30:01 crc kubenswrapper[4658]: I1002 12:30:01.682446 4658 generic.go:334] "Generic (PLEG): container finished" podID="2761d76a-b65d-4802-87f2-5f261215fe0a" containerID="a8340729631e4ed8fbede47b489e7f3a0e614def2ca308cf2d815dfa4a91523c" exitCode=0 Oct 02 12:30:01 crc kubenswrapper[4658]: I1002 12:30:01.682544 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" event={"ID":"2761d76a-b65d-4802-87f2-5f261215fe0a","Type":"ContainerDied","Data":"a8340729631e4ed8fbede47b489e7f3a0e614def2ca308cf2d815dfa4a91523c"} Oct 02 12:30:01 crc kubenswrapper[4658]: I1002 12:30:01.682868 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" event={"ID":"2761d76a-b65d-4802-87f2-5f261215fe0a","Type":"ContainerStarted","Data":"e796ee458a3c995d73d4a7fd9d208a27b9a28aeca5c5c098dd55925885bbcb08"} Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.090391 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.186043 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume\") pod \"2761d76a-b65d-4802-87f2-5f261215fe0a\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.186328 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9vpm\" (UniqueName: \"kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm\") pod \"2761d76a-b65d-4802-87f2-5f261215fe0a\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.186392 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume\") pod \"2761d76a-b65d-4802-87f2-5f261215fe0a\" (UID: \"2761d76a-b65d-4802-87f2-5f261215fe0a\") " Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.187409 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2761d76a-b65d-4802-87f2-5f261215fe0a" (UID: "2761d76a-b65d-4802-87f2-5f261215fe0a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.193053 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2761d76a-b65d-4802-87f2-5f261215fe0a" (UID: "2761d76a-b65d-4802-87f2-5f261215fe0a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.193553 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm" (OuterVolumeSpecName: "kube-api-access-c9vpm") pod "2761d76a-b65d-4802-87f2-5f261215fe0a" (UID: "2761d76a-b65d-4802-87f2-5f261215fe0a"). InnerVolumeSpecName "kube-api-access-c9vpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.288219 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2761d76a-b65d-4802-87f2-5f261215fe0a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.288258 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9vpm\" (UniqueName: \"kubernetes.io/projected/2761d76a-b65d-4802-87f2-5f261215fe0a-kube-api-access-c9vpm\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.288268 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2761d76a-b65d-4802-87f2-5f261215fe0a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.700779 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" event={"ID":"2761d76a-b65d-4802-87f2-5f261215fe0a","Type":"ContainerDied","Data":"e796ee458a3c995d73d4a7fd9d208a27b9a28aeca5c5c098dd55925885bbcb08"} Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.700828 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e796ee458a3c995d73d4a7fd9d208a27b9a28aeca5c5c098dd55925885bbcb08" Oct 02 12:30:03 crc kubenswrapper[4658]: I1002 12:30:03.700863 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-nfj5m" Oct 02 12:30:04 crc kubenswrapper[4658]: I1002 12:30:04.162511 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn"] Oct 02 12:30:04 crc kubenswrapper[4658]: I1002 12:30:04.172201 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-g2jcn"] Oct 02 12:30:05 crc kubenswrapper[4658]: I1002 12:30:05.964931 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b9acdf-9c53-4731-91a0-3126924ff057" path="/var/lib/kubelet/pods/71b9acdf-9c53-4731-91a0-3126924ff057/volumes" Oct 02 12:30:24 crc kubenswrapper[4658]: I1002 12:30:24.785507 4658 scope.go:117] "RemoveContainer" containerID="947a0011ab783f6fbb3dbd611b93135d065ed3f32f324114e33ba4baa74ec847" Oct 02 12:30:27 crc kubenswrapper[4658]: I1002 12:30:27.430359 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:30:27 crc kubenswrapper[4658]: I1002 12:30:27.430736 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:30:57 crc kubenswrapper[4658]: I1002 12:30:57.429784 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 02 12:30:57 crc kubenswrapper[4658]: I1002 12:30:57.430326 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:30:57 crc kubenswrapper[4658]: I1002 12:30:57.430376 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:30:57 crc kubenswrapper[4658]: I1002 12:30:57.431197 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:30:57 crc kubenswrapper[4658]: I1002 12:30:57.431253 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" gracePeriod=600 Oct 02 12:30:57 crc kubenswrapper[4658]: E1002 12:30:57.569495 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:30:58 crc kubenswrapper[4658]: I1002 12:30:58.233288 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" exitCode=0 Oct 02 12:30:58 crc kubenswrapper[4658]: I1002 12:30:58.233332 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b"} Oct 02 12:30:58 crc kubenswrapper[4658]: I1002 12:30:58.233720 4658 scope.go:117] "RemoveContainer" containerID="ca7bac88d9b7f9e81ba33b516d70177becc7c1be6f07ae1bf2b99a4256cb835e" Oct 02 12:30:58 crc kubenswrapper[4658]: I1002 12:30:58.234677 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:30:58 crc kubenswrapper[4658]: E1002 12:30:58.235118 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:31:08 crc kubenswrapper[4658]: I1002 12:31:08.949040 4658 scope.go:117] "RemoveContainer" 
containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:31:08 crc kubenswrapper[4658]: E1002 12:31:08.950019 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:31:20 crc kubenswrapper[4658]: I1002 12:31:20.949646 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:31:20 crc kubenswrapper[4658]: E1002 12:31:20.950704 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:31:32 crc kubenswrapper[4658]: I1002 12:31:32.949427 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:31:32 crc kubenswrapper[4658]: E1002 12:31:32.950267 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:31:45 crc kubenswrapper[4658]: I1002 12:31:45.949975 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:31:45 crc kubenswrapper[4658]: E1002 12:31:45.950863 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:31:57 crc kubenswrapper[4658]: I1002 12:31:57.949179 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:31:57 crc kubenswrapper[4658]: E1002 12:31:57.950901 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:32:10 crc kubenswrapper[4658]: I1002 12:32:10.949656 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:32:10 crc kubenswrapper[4658]: E1002 12:32:10.950431 4658 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:32:24 crc kubenswrapper[4658]: I1002 12:32:24.950343 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:32:24 crc kubenswrapper[4658]: E1002 12:32:24.951941 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.354441 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:32 crc kubenswrapper[4658]: E1002 12:32:32.355784 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2761d76a-b65d-4802-87f2-5f261215fe0a" containerName="collect-profiles" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.355808 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="2761d76a-b65d-4802-87f2-5f261215fe0a" containerName="collect-profiles" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.356258 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="2761d76a-b65d-4802-87f2-5f261215fe0a" containerName="collect-profiles" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.358035 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.363782 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.467814 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.467958 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpcbk\" (UniqueName: \"kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.468008 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.570175 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpcbk\" (UniqueName: \"kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.570217 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.570433 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.570766 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.571047 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.590495 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bpcbk\" (UniqueName: \"kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk\") pod \"community-operators-fqtzd\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:32 crc kubenswrapper[4658]: I1002 12:32:32.688829 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:33 crc kubenswrapper[4658]: I1002 12:32:33.211073 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:34 crc kubenswrapper[4658]: I1002 12:32:34.190153 4658 generic.go:334] "Generic (PLEG): container finished" podID="54067241-acda-48e7-986f-36b08764897c" containerID="006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a" exitCode=0 Oct 02 12:32:34 crc kubenswrapper[4658]: I1002 12:32:34.190221 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerDied","Data":"006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a"} Oct 02 12:32:34 crc kubenswrapper[4658]: I1002 12:32:34.191430 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerStarted","Data":"369fba562de2eb0d476b4d47a8a19cc29f8809e78d235a5cd6ea8f5fa4dee844"} Oct 02 12:32:34 crc kubenswrapper[4658]: I1002 12:32:34.193477 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:32:36 crc kubenswrapper[4658]: I1002 12:32:36.217580 4658 generic.go:334] "Generic (PLEG): container finished" podID="54067241-acda-48e7-986f-36b08764897c" containerID="5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70" exitCode=0 Oct 02 12:32:36 crc kubenswrapper[4658]: I1002 12:32:36.217724 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerDied","Data":"5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70"} Oct 02 12:32:36 crc kubenswrapper[4658]: I1002 12:32:36.950651 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:32:36 crc kubenswrapper[4658]: E1002 12:32:36.951469 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:32:37 crc kubenswrapper[4658]: I1002 12:32:37.229815 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerStarted","Data":"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f"} Oct 02 12:32:37 crc kubenswrapper[4658]: I1002 12:32:37.267079 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqtzd" podStartSLOduration=2.788612755 
podStartE2EDuration="5.267057905s" podCreationTimestamp="2025-10-02 12:32:32 +0000 UTC" firstStartedPulling="2025-10-02 12:32:34.193160467 +0000 UTC m=+4435.084314034" lastFinishedPulling="2025-10-02 12:32:36.671605607 +0000 UTC m=+4437.562759184" observedRunningTime="2025-10-02 12:32:37.255670558 +0000 UTC m=+4438.146824125" watchObservedRunningTime="2025-10-02 12:32:37.267057905 +0000 UTC m=+4438.158211492" Oct 02 12:32:42 crc kubenswrapper[4658]: I1002 12:32:42.689330 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:42 crc kubenswrapper[4658]: I1002 12:32:42.689607 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:42 crc kubenswrapper[4658]: I1002 12:32:42.733285 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:43 crc kubenswrapper[4658]: I1002 12:32:43.361348 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:43 crc kubenswrapper[4658]: I1002 12:32:43.411113 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:45 crc kubenswrapper[4658]: I1002 12:32:45.316885 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqtzd" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="registry-server" containerID="cri-o://55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f" gracePeriod=2 Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.241349 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.328090 4658 generic.go:334] "Generic (PLEG): container finished" podID="54067241-acda-48e7-986f-36b08764897c" containerID="55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f" exitCode=0 Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.328179 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerDied","Data":"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f"} Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.328206 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqtzd" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.328229 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqtzd" event={"ID":"54067241-acda-48e7-986f-36b08764897c","Type":"ContainerDied","Data":"369fba562de2eb0d476b4d47a8a19cc29f8809e78d235a5cd6ea8f5fa4dee844"} Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.328265 4658 scope.go:117] "RemoveContainer" containerID="55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.357767 4658 scope.go:117] "RemoveContainer" containerID="5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.357945 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content\") pod \"54067241-acda-48e7-986f-36b08764897c\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.358172 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpcbk\" (UniqueName: \"kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk\") pod \"54067241-acda-48e7-986f-36b08764897c\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.358374 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities\") pod \"54067241-acda-48e7-986f-36b08764897c\" (UID: \"54067241-acda-48e7-986f-36b08764897c\") " Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.359697 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities" (OuterVolumeSpecName: "utilities") pod "54067241-acda-48e7-986f-36b08764897c" (UID: "54067241-acda-48e7-986f-36b08764897c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.366594 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk" (OuterVolumeSpecName: "kube-api-access-bpcbk") pod "54067241-acda-48e7-986f-36b08764897c" (UID: "54067241-acda-48e7-986f-36b08764897c"). InnerVolumeSpecName "kube-api-access-bpcbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.379880 4658 scope.go:117] "RemoveContainer" containerID="006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.418430 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54067241-acda-48e7-986f-36b08764897c" (UID: "54067241-acda-48e7-986f-36b08764897c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.460971 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpcbk\" (UniqueName: \"kubernetes.io/projected/54067241-acda-48e7-986f-36b08764897c-kube-api-access-bpcbk\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.461022 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.461036 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54067241-acda-48e7-986f-36b08764897c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.468345 4658 scope.go:117] "RemoveContainer" containerID="55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f" Oct 02 12:32:46 crc kubenswrapper[4658]: E1002 12:32:46.468778 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f\": container with ID starting with 55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f not found: ID does not exist" containerID="55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.468824 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f"} err="failed to get container status \"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f\": rpc error: code = NotFound desc = could not find container \"55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f\": container with ID starting with 55eabbce4e472439d63b34780a6351b4466161712a3286b3f34429f489aaf44f not found: ID does not exist" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.468849 4658 scope.go:117] "RemoveContainer" containerID="5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70" Oct 02 12:32:46 crc kubenswrapper[4658]: E1002 12:32:46.469286 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70\": container with ID starting with 5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70 not found: ID does not exist" containerID="5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.469367 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70"} err="failed to get container status \"5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70\": rpc error: code = NotFound desc = could not find container \"5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70\": container with ID starting with 5976e57ec279ea9e28f1c1d9472dc936d41040ba525d9fd9b5af6dd7a1e45e70 not found: ID does not exist" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.469381 4658 scope.go:117] "RemoveContainer" containerID="006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a" Oct 02 12:32:46 crc 
kubenswrapper[4658]: E1002 12:32:46.469653 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a\": container with ID starting with 006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a not found: ID does not exist" containerID="006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.469687 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a"} err="failed to get container status \"006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a\": rpc error: code = NotFound desc = could not find container \"006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a\": container with ID starting with 006d5c82bc418682c06a8a92c13b1671393327bbdf9633766b19992ab75fee9a not found: ID does not exist" Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.661762 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:46 crc kubenswrapper[4658]: I1002 12:32:46.672105 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqtzd"] Oct 02 12:32:47 crc kubenswrapper[4658]: I1002 12:32:47.971547 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54067241-acda-48e7-986f-36b08764897c" path="/var/lib/kubelet/pods/54067241-acda-48e7-986f-36b08764897c/volumes" Oct 02 12:32:51 crc kubenswrapper[4658]: I1002 12:32:51.949484 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:32:51 crc kubenswrapper[4658]: E1002 12:32:51.950429 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:33:02 crc kubenswrapper[4658]: I1002 12:33:02.949271 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:33:02 crc kubenswrapper[4658]: E1002 12:33:02.950042 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:33:15 crc kubenswrapper[4658]: I1002 12:33:15.948842 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:33:15 crc kubenswrapper[4658]: E1002 12:33:15.949602 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:33:30 crc kubenswrapper[4658]: I1002 12:33:30.950053 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:33:30 crc kubenswrapper[4658]: E1002 12:33:30.951178 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:33:44 crc kubenswrapper[4658]: I1002 12:33:44.950157 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:33:44 crc kubenswrapper[4658]: E1002 12:33:44.950906 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:33:57 crc kubenswrapper[4658]: I1002 12:33:57.949777 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:33:57 crc kubenswrapper[4658]: E1002 12:33:57.950641 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:34:12 crc kubenswrapper[4658]: I1002 12:34:12.949015 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:34:12 crc kubenswrapper[4658]: E1002 12:34:12.951591 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:34:26 crc kubenswrapper[4658]: I1002 12:34:26.949835 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:34:26 crc kubenswrapper[4658]: E1002 12:34:26.950823 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:34:41 crc kubenswrapper[4658]: I1002 12:34:41.949892 4658 
scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:34:41 crc kubenswrapper[4658]: E1002 12:34:41.950546 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:34:54 crc kubenswrapper[4658]: I1002 12:34:54.948762 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:34:54 crc kubenswrapper[4658]: E1002 12:34:54.949711 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:35:06 crc kubenswrapper[4658]: I1002 12:35:06.949646 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:35:06 crc kubenswrapper[4658]: E1002 12:35:06.950683 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:35:17 crc kubenswrapper[4658]: I1002 12:35:17.953604 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:35:17 crc kubenswrapper[4658]: E1002 12:35:17.954341 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:35:28 crc kubenswrapper[4658]: I1002 12:35:28.951180 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:35:28 crc kubenswrapper[4658]: E1002 12:35:28.952232 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:35:39 crc kubenswrapper[4658]: I1002 12:35:39.968076 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:35:39 crc kubenswrapper[4658]: E1002 12:35:39.969736 4658 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:35:53 crc kubenswrapper[4658]: I1002 12:35:53.949059 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:35:53 crc kubenswrapper[4658]: E1002 12:35:53.949943 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:36:05 crc kubenswrapper[4658]: I1002 12:36:05.949962 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:36:06 crc kubenswrapper[4658]: I1002 12:36:06.370430 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5"}
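
Between 12:32:51 and 12:36:05 the sync loop requeues machine-config-daemon roughly every 11-15 seconds, and every attempt is rejected with the same "back-off 5m0s" message: the kubelet restarts a crashing container with an exponential back-off that doubles from a 10s base up to a 5m cap, so once the cap is reached nothing actually starts until the full five minutes have elapsed, which happens here at 12:36:05/12:36:06 when the old container f95b5d74... is removed and 1c8a8697... starts. A sketch of that schedule (base, factor, and cap match the documented kubelet defaults; the function itself is illustrative):

    package main

    import (
        "fmt"
        "time"
    )

    // restartBackoff returns the delay before restart attempt n (0-based),
    // doubling from base and saturating at limit: 10s, 20s, 40s, ..., 5m, 5m, ...
    func restartBackoff(n int, base, limit time.Duration) time.Duration {
        d := base
        for i := 0; i < n; i++ {
            d *= 2
            if d >= limit {
                return limit
            }
        }
        return d
    }

    func main() {
        for n := 0; n < 8; n++ {
            fmt.Printf("attempt %d: wait %v\n", n, restartBackoff(n, 10*time.Second, 5*time.Minute))
        }
    }
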
Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.016665 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:40 crc kubenswrapper[4658]: E1002 12:36:40.017501 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="extract-utilities" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.017514 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="extract-utilities" Oct 02 12:36:40 crc kubenswrapper[4658]: E1002 12:36:40.017537 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="extract-content" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.017544 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="extract-content" Oct 02 12:36:40 crc kubenswrapper[4658]: E1002 12:36:40.017594 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="registry-server" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.017601 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="registry-server" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.017815 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="54067241-acda-48e7-986f-36b08764897c" containerName="registry-server" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.020585 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.025145 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.211260 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnf5q\" (UniqueName: \"kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.211368 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.211473 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.313061 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnf5q\" (UniqueName: \"kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.313136 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.313238 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.313698 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.313746 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw"
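
The volume lines above are the kubelet's volume manager reconciling actual state toward desired state for the new pod: VerifyControllerAttachedVolume registers each volume, then MountVolume.SetUp materializes it. The two emptyDir volumes (catalog-content, utilities) succeed within a millisecond because they are just directories, while the projected service-account token volume (kube-api-access-wnf5q, completed just below) takes somewhat longer. A toy reconcile loop in the same spirit, with invented types and names:

    package main

    import "fmt"

    // volume is a toy stand-in for the volume manager's bookkeeping.
    type volume struct {
        name string
    }

    // reconcile mounts whatever desired lists but actual lacks, and tears down
    // whatever actual still holds but desired no longer lists.
    func reconcile(desired map[string]bool, actual map[string]*volume) {
        for name := range desired {
            if _, ok := actual[name]; !ok {
                actual[name] = &volume{name: name}
                fmt.Println("MountVolume.SetUp succeeded for volume", name)
            }
        }
        for name := range actual {
            if !desired[name] {
                delete(actual, name)
                fmt.Println("UnmountVolume.TearDown succeeded for volume", name)
            }
        }
    }

    func main() {
        actual := map[string]*volume{}
        pod := map[string]bool{"catalog-content": true, "utilities": true, "kube-api-access-wnf5q": true}
        reconcile(pod, actual)               // pod scheduled: mount everything
        reconcile(map[string]bool{}, actual) // pod deleted: tear everything down
    }
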
"MountVolume.SetUp succeeded for volume \"kube-api-access-wnf5q\" (UniqueName: \"kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q\") pod \"certified-operators-f99jw\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.348899 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:40 crc kubenswrapper[4658]: I1002 12:36:40.896970 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:41 crc kubenswrapper[4658]: E1002 12:36:41.250325 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a0e7c1_a4ce_4331_8ed0_484f6c528f23.slice/crio-conmon-7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:36:41 crc kubenswrapper[4658]: I1002 12:36:41.700877 4658 generic.go:334] "Generic (PLEG): container finished" podID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerID="7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7" exitCode=0 Oct 02 12:36:41 crc kubenswrapper[4658]: I1002 12:36:41.701038 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerDied","Data":"7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7"} Oct 02 12:36:41 crc kubenswrapper[4658]: I1002 12:36:41.701192 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerStarted","Data":"350f5bb56899ab37b05f5c28d6b2c138c91174ba389afeba08721a6fcb260f3f"} Oct 02 12:36:42 crc kubenswrapper[4658]: I1002 12:36:42.712566 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerStarted","Data":"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a"} Oct 02 12:36:43 crc kubenswrapper[4658]: I1002 12:36:43.726491 4658 generic.go:334] "Generic (PLEG): container finished" podID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerID="2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a" exitCode=0 Oct 02 12:36:43 crc kubenswrapper[4658]: I1002 12:36:43.726572 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerDied","Data":"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a"} Oct 02 12:36:44 crc kubenswrapper[4658]: I1002 12:36:44.740627 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerStarted","Data":"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934"} Oct 02 12:36:44 crc kubenswrapper[4658]: I1002 12:36:44.765411 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f99jw" podStartSLOduration=3.313946797 podStartE2EDuration="5.765386148s" podCreationTimestamp="2025-10-02 12:36:39 +0000 UTC" 
firstStartedPulling="2025-10-02 12:36:41.703630231 +0000 UTC m=+4682.594783788" lastFinishedPulling="2025-10-02 12:36:44.155069562 +0000 UTC m=+4685.046223139" observedRunningTime="2025-10-02 12:36:44.758151376 +0000 UTC m=+4685.649304943" watchObservedRunningTime="2025-10-02 12:36:44.765386148 +0000 UTC m=+4685.656539715" Oct 02 12:36:50 crc kubenswrapper[4658]: I1002 12:36:50.349718 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:50 crc kubenswrapper[4658]: I1002 12:36:50.350447 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:50 crc kubenswrapper[4658]: I1002 12:36:50.416400 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:50 crc kubenswrapper[4658]: I1002 12:36:50.865711 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:50 crc kubenswrapper[4658]: I1002 12:36:50.908109 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:52 crc kubenswrapper[4658]: I1002 12:36:52.821845 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f99jw" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="registry-server" containerID="cri-o://40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934" gracePeriod=2 Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.394040 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.577401 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content\") pod \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.577534 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities\") pod \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.577595 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnf5q\" (UniqueName: \"kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q\") pod \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\" (UID: \"04a0e7c1-a4ce-4331-8ed0-484f6c528f23\") " Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.578861 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities" (OuterVolumeSpecName: "utilities") pod "04a0e7c1-a4ce-4331-8ed0-484f6c528f23" (UID: "04a0e7c1-a4ce-4331-8ed0-484f6c528f23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.583282 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q" (OuterVolumeSpecName: "kube-api-access-wnf5q") pod "04a0e7c1-a4ce-4331-8ed0-484f6c528f23" (UID: "04a0e7c1-a4ce-4331-8ed0-484f6c528f23"). InnerVolumeSpecName "kube-api-access-wnf5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.616848 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04a0e7c1-a4ce-4331-8ed0-484f6c528f23" (UID: "04a0e7c1-a4ce-4331-8ed0-484f6c528f23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.680119 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.680521 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnf5q\" (UniqueName: \"kubernetes.io/projected/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-kube-api-access-wnf5q\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.680537 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a0e7c1-a4ce-4331-8ed0-484f6c528f23-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.840339 4658 generic.go:334] "Generic (PLEG): container finished" podID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerID="40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934" exitCode=0 Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.840458 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerDied","Data":"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934"} Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.841651 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f99jw" event={"ID":"04a0e7c1-a4ce-4331-8ed0-484f6c528f23","Type":"ContainerDied","Data":"350f5bb56899ab37b05f5c28d6b2c138c91174ba389afeba08721a6fcb260f3f"} Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.840506 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f99jw" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.841700 4658 scope.go:117] "RemoveContainer" containerID="40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.882265 4658 scope.go:117] "RemoveContainer" containerID="2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.904487 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.912512 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f99jw"] Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.920184 4658 scope.go:117] "RemoveContainer" containerID="7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.959326 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" path="/var/lib/kubelet/pods/04a0e7c1-a4ce-4331-8ed0-484f6c528f23/volumes" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.976392 4658 scope.go:117] "RemoveContainer" containerID="40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934" Oct 02 12:36:53 crc kubenswrapper[4658]: E1002 12:36:53.976897 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934\": container with ID starting with 40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934 not found: ID does not exist" containerID="40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.976946 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934"} err="failed to get container status \"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934\": rpc error: code = NotFound desc = could not find container \"40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934\": container with ID starting with 40e4de4cb3753f0dda8cfc1bf1aaf4757ad29b582c0e7136bc3d6fd34e466934 not found: ID does not exist" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.976979 4658 scope.go:117] "RemoveContainer" containerID="2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a" Oct 02 12:36:53 crc kubenswrapper[4658]: E1002 12:36:53.977567 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a\": container with ID starting with 2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a not found: ID does not exist" containerID="2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.977610 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a"} err="failed to get container status \"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a\": rpc error: code = NotFound desc = could not find container 
\"2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a\": container with ID starting with 2867b6a5e24bc7655afc6423c493a0cac5e007defbc3ebca24b75c209328509a not found: ID does not exist" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.977638 4658 scope.go:117] "RemoveContainer" containerID="7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7" Oct 02 12:36:53 crc kubenswrapper[4658]: E1002 12:36:53.977930 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7\": container with ID starting with 7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7 not found: ID does not exist" containerID="7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7" Oct 02 12:36:53 crc kubenswrapper[4658]: I1002 12:36:53.977991 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7"} err="failed to get container status \"7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7\": rpc error: code = NotFound desc = could not find container \"7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7\": container with ID starting with 7359b6939606e0ea557fc556fe5afee03a5957cefdba318fc1fba7f64945a9e7 not found: ID does not exist" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.056864 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:42 crc kubenswrapper[4658]: E1002 12:37:42.058828 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="registry-server" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.058850 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="registry-server" Oct 02 12:37:42 crc kubenswrapper[4658]: E1002 12:37:42.058933 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="extract-utilities" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.058946 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="extract-utilities" Oct 02 12:37:42 crc kubenswrapper[4658]: E1002 12:37:42.058982 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="extract-content" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.058997 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="extract-content" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.060432 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a0e7c1-a4ce-4331-8ed0-484f6c528f23" containerName="registry-server" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.068420 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.094766 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.147999 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-utilities\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.148069 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2zj\" (UniqueName: \"kubernetes.io/projected/89b92b5e-986f-4252-beb3-86ba738d46e3-kube-api-access-gx2zj\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.148169 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-catalog-content\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.249954 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-catalog-content\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.250126 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-utilities\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.250147 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx2zj\" (UniqueName: \"kubernetes.io/projected/89b92b5e-986f-4252-beb3-86ba738d46e3-kube-api-access-gx2zj\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.250504 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-catalog-content\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.250608 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-utilities\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.270934 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gx2zj\" (UniqueName: \"kubernetes.io/projected/89b92b5e-986f-4252-beb3-86ba738d46e3-kube-api-access-gx2zj\") pod \"redhat-marketplace-jp8t9\" (UID: \"89b92b5e-986f-4252-beb3-86ba738d46e3\") " pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.395586 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:42 crc kubenswrapper[4658]: W1002 12:37:42.859564 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b92b5e_986f_4252_beb3_86ba738d46e3.slice/crio-00813c660bfb1c4a70bdd249506e5841110ba6cef3cb2262766e68aacba0713c WatchSource:0}: Error finding container 00813c660bfb1c4a70bdd249506e5841110ba6cef3cb2262766e68aacba0713c: Status 404 returned error can't find the container with id 00813c660bfb1c4a70bdd249506e5841110ba6cef3cb2262766e68aacba0713c Oct 02 12:37:42 crc kubenswrapper[4658]: I1002 12:37:42.865345 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:43 crc kubenswrapper[4658]: I1002 12:37:43.325168 4658 generic.go:334] "Generic (PLEG): container finished" podID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerID="97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481" exitCode=0 Oct 02 12:37:43 crc kubenswrapper[4658]: I1002 12:37:43.325266 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerDied","Data":"97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481"} Oct 02 12:37:43 crc kubenswrapper[4658]: I1002 12:37:43.325552 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerStarted","Data":"00813c660bfb1c4a70bdd249506e5841110ba6cef3cb2262766e68aacba0713c"} Oct 02 12:37:43 crc kubenswrapper[4658]: I1002 12:37:43.327213 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:37:44 crc kubenswrapper[4658]: I1002 12:37:44.337720 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerStarted","Data":"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c"} Oct 02 12:37:45 crc kubenswrapper[4658]: I1002 12:37:45.347655 4658 generic.go:334] "Generic (PLEG): container finished" podID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerID="25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c" exitCode=0 Oct 02 12:37:45 crc kubenswrapper[4658]: I1002 12:37:45.347703 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerDied","Data":"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c"} Oct 02 12:37:46 crc kubenswrapper[4658]: I1002 12:37:46.360943 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerStarted","Data":"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e"} Oct 02 12:37:46 crc kubenswrapper[4658]: I1002 12:37:46.386793 4658 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jp8t9" podStartSLOduration=1.952486748 podStartE2EDuration="4.386772328s" podCreationTimestamp="2025-10-02 12:37:42 +0000 UTC" firstStartedPulling="2025-10-02 12:37:43.326992604 +0000 UTC m=+4744.218146171" lastFinishedPulling="2025-10-02 12:37:45.761278184 +0000 UTC m=+4746.652431751" observedRunningTime="2025-10-02 12:37:46.381400975 +0000 UTC m=+4747.272554542" watchObservedRunningTime="2025-10-02 12:37:46.386772328 +0000 UTC m=+4747.277925895" Oct 02 12:37:52 crc kubenswrapper[4658]: I1002 12:37:52.396488 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:52 crc kubenswrapper[4658]: I1002 12:37:52.397476 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:52 crc kubenswrapper[4658]: I1002 12:37:52.458389 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:52 crc kubenswrapper[4658]: I1002 12:37:52.512836 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:52 crc kubenswrapper[4658]: I1002 12:37:52.698331 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:54 crc kubenswrapper[4658]: I1002 12:37:54.446993 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jp8t9" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="registry-server" containerID="cri-o://a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e" gracePeriod=2
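
gracePeriod=2 gives registry-server two seconds between the SIGTERM the runtime delivers and the SIGKILL that would follow; the container makes it out in time (the PLEG reports it finished with exitCode=0 just below). A sketch of the term-then-kill pattern on a POSIX system (stopWithGrace is illustrative, not CRI-O's implementation):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace sends SIGTERM, waits up to grace, then falls back to SIGKILL.
    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
        _ = cmd.Process.Signal(syscall.SIGTERM)
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case <-done:
            fmt.Println("exited within grace period")
        case <-time.After(grace):
            _ = cmd.Process.Kill() // SIGKILL
            <-done
            fmt.Println("killed after grace period expired")
        }
    }

    func main() {
        cmd := exec.Command("sleep", "30")
        _ = cmd.Start()
        stopWithGrace(cmd, 2*time.Second)
    }
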
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.128563 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b92b5e-986f-4252-beb3-86ba738d46e3-kube-api-access-gx2zj" (OuterVolumeSpecName: "kube-api-access-gx2zj") pod "89b92b5e-986f-4252-beb3-86ba738d46e3" (UID: "89b92b5e-986f-4252-beb3-86ba738d46e3"). InnerVolumeSpecName "kube-api-access-gx2zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.138979 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89b92b5e-986f-4252-beb3-86ba738d46e3" (UID: "89b92b5e-986f-4252-beb3-86ba738d46e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.227155 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.227203 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b92b5e-986f-4252-beb3-86ba738d46e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.227216 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx2zj\" (UniqueName: \"kubernetes.io/projected/89b92b5e-986f-4252-beb3-86ba738d46e3-kube-api-access-gx2zj\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.457030 4658 generic.go:334] "Generic (PLEG): container finished" podID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerID="a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e" exitCode=0 Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.457078 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerDied","Data":"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e"} Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.457143 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp8t9" event={"ID":"89b92b5e-986f-4252-beb3-86ba738d46e3","Type":"ContainerDied","Data":"00813c660bfb1c4a70bdd249506e5841110ba6cef3cb2262766e68aacba0713c"} Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.457170 4658 scope.go:117] "RemoveContainer" containerID="a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.457100 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp8t9" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.477184 4658 scope.go:117] "RemoveContainer" containerID="25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.491505 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.500062 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp8t9"] Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.512967 4658 scope.go:117] "RemoveContainer" containerID="97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.543915 4658 scope.go:117] "RemoveContainer" containerID="a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e" Oct 02 12:37:55 crc kubenswrapper[4658]: E1002 12:37:55.545120 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e\": container with ID starting with a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e not found: ID does not exist" containerID="a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.545152 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e"} err="failed to get container status \"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e\": rpc error: code = NotFound desc = could not find container \"a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e\": container with ID starting with a2a3610fef17d16672e4606d6c7709d9428db31d4681fea21e0a0cdb4002d26e not found: ID does not exist" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.545173 4658 scope.go:117] "RemoveContainer" containerID="25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c" Oct 02 12:37:55 crc kubenswrapper[4658]: E1002 12:37:55.545529 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c\": container with ID starting with 25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c not found: ID does not exist" containerID="25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.545553 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c"} err="failed to get container status \"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c\": rpc error: code = NotFound desc = could not find container \"25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c\": container with ID starting with 25e6e0dca883bb9f22b3a08ff12e372f0e7b38ed97702c0af12c468e3ff21d4c not found: ID does not exist" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.545594 4658 scope.go:117] "RemoveContainer" containerID="97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481" Oct 02 12:37:55 crc kubenswrapper[4658]: E1002 12:37:55.546077 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481\": container with ID starting with 97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481 not found: ID does not exist" containerID="97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.546122 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481"} err="failed to get container status \"97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481\": rpc error: code = NotFound desc = could not find container \"97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481\": container with ID starting with 97059a586c49fc4c6bdd3b0c640a7cf8ad4b032835d6ef874f2a8d5969f17481 not found: ID does not exist" Oct 02 12:37:55 crc kubenswrapper[4658]: I1002 12:37:55.961069 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" path="/var/lib/kubelet/pods/89b92b5e-986f-4252-beb3-86ba738d46e3/volumes" Oct 02 12:38:27 crc kubenswrapper[4658]: I1002 12:38:27.429869 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:38:27 crc kubenswrapper[4658]: I1002 12:38:27.430740 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:38:57 crc kubenswrapper[4658]: I1002 12:38:57.429829 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:38:57 crc kubenswrapper[4658]: I1002 12:38:57.430381 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:39:27 crc kubenswrapper[4658]: I1002 12:39:27.429632 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:39:27 crc kubenswrapper[4658]: I1002 12:39:27.430274 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:39:27 crc kubenswrapper[4658]: I1002 12:39:27.430346 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:39:27 crc kubenswrapper[4658]: I1002 12:39:27.431145 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:39:27 crc kubenswrapper[4658]: I1002 12:39:27.431209 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5" gracePeriod=600 Oct 02 12:39:28 crc kubenswrapper[4658]: I1002 12:39:28.407007 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5" exitCode=0 Oct 02 12:39:28 crc kubenswrapper[4658]: I1002 12:39:28.407065 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5"} Oct 02 12:39:28 crc kubenswrapper[4658]: I1002 12:39:28.407614 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86"} Oct 02 12:39:28 crc kubenswrapper[4658]: I1002 12:39:28.407638 4658 scope.go:117] "RemoveContainer" containerID="f95b5d7471f785908fdc45f9350b0e75af0a75f1906f2e01801d5851ee1c018b" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.505424 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:01 crc kubenswrapper[4658]: E1002 12:40:01.506634 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="extract-utilities" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.506659 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="extract-utilities" Oct 02 12:40:01 crc kubenswrapper[4658]: E1002 12:40:01.506705 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="registry-server" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.506719 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="registry-server" Oct 02 12:40:01 crc kubenswrapper[4658]: E1002 12:40:01.506790 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="extract-content" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.506804 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" containerName="extract-content" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.507132 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b92b5e-986f-4252-beb3-86ba738d46e3" 
containerName="registry-server" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.509107 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.528449 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.604670 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.604838 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.605027 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdzqt\" (UniqueName: \"kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.706796 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.706896 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.707024 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdzqt\" (UniqueName: \"kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.707586 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:01 crc kubenswrapper[4658]: I1002 12:40:01.707714 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 
02 12:40:02 crc kubenswrapper[4658]: I1002 12:40:02.183533 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdzqt\" (UniqueName: \"kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt\") pod \"redhat-operators-9hzp6\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:02 crc kubenswrapper[4658]: I1002 12:40:02.429352 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:03 crc kubenswrapper[4658]: I1002 12:40:03.089385 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:03 crc kubenswrapper[4658]: I1002 12:40:03.749118 4658 generic.go:334] "Generic (PLEG): container finished" podID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerID="34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8" exitCode=0 Oct 02 12:40:03 crc kubenswrapper[4658]: I1002 12:40:03.749178 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerDied","Data":"34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8"} Oct 02 12:40:03 crc kubenswrapper[4658]: I1002 12:40:03.749504 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerStarted","Data":"12c25d42cdaac636a3b0e1c4d024e55b94f2ec2a50fc67078872feabf07bb9ed"} Oct 02 12:40:07 crc kubenswrapper[4658]: I1002 12:40:07.792150 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerStarted","Data":"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c"} Oct 02 12:40:16 crc kubenswrapper[4658]: I1002 12:40:16.893717 4658 generic.go:334] "Generic (PLEG): container finished" podID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerID="7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c" exitCode=0 Oct 02 12:40:16 crc kubenswrapper[4658]: I1002 12:40:16.894361 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerDied","Data":"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c"} Oct 02 12:40:19 crc kubenswrapper[4658]: I1002 12:40:19.935215 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerStarted","Data":"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd"} Oct 02 12:40:19 crc kubenswrapper[4658]: I1002 12:40:19.965837 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9hzp6" podStartSLOduration=3.46109183 podStartE2EDuration="18.965810709s" podCreationTimestamp="2025-10-02 12:40:01 +0000 UTC" firstStartedPulling="2025-10-02 12:40:03.751484772 +0000 UTC m=+4884.642638349" lastFinishedPulling="2025-10-02 12:40:19.256203631 +0000 UTC m=+4900.147357228" observedRunningTime="2025-10-02 12:40:19.955902981 +0000 UTC m=+4900.847056558" watchObservedRunningTime="2025-10-02 12:40:19.965810709 +0000 UTC m=+4900.856964316" Oct 02 12:40:22 crc kubenswrapper[4658]: I1002 
12:40:22.429858 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:22 crc kubenswrapper[4658]: I1002 12:40:22.430262 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:23 crc kubenswrapper[4658]: I1002 12:40:23.490741 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9hzp6" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="registry-server" probeResult="failure" output=< Oct 02 12:40:23 crc kubenswrapper[4658]: timeout: failed to connect service ":50051" within 1s Oct 02 12:40:23 crc kubenswrapper[4658]: > Oct 02 12:40:32 crc kubenswrapper[4658]: I1002 12:40:32.512067 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:32 crc kubenswrapper[4658]: I1002 12:40:32.587973 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:32 crc kubenswrapper[4658]: I1002 12:40:32.760945 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.094403 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9hzp6" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="registry-server" containerID="cri-o://7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd" gracePeriod=2 Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.643494 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.744602 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdzqt\" (UniqueName: \"kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt\") pod \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.744865 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content\") pod \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.747774 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities\") pod \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\" (UID: \"fb1a6bdc-cc68-4061-b203-94ffc445ca74\") " Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.750545 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities" (OuterVolumeSpecName: "utilities") pod "fb1a6bdc-cc68-4061-b203-94ffc445ca74" (UID: "fb1a6bdc-cc68-4061-b203-94ffc445ca74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.751657 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.758847 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt" (OuterVolumeSpecName: "kube-api-access-bdzqt") pod "fb1a6bdc-cc68-4061-b203-94ffc445ca74" (UID: "fb1a6bdc-cc68-4061-b203-94ffc445ca74"). InnerVolumeSpecName "kube-api-access-bdzqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.854150 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdzqt\" (UniqueName: \"kubernetes.io/projected/fb1a6bdc-cc68-4061-b203-94ffc445ca74-kube-api-access-bdzqt\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.875970 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb1a6bdc-cc68-4061-b203-94ffc445ca74" (UID: "fb1a6bdc-cc68-4061-b203-94ffc445ca74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:34 crc kubenswrapper[4658]: I1002 12:40:34.955118 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1a6bdc-cc68-4061-b203-94ffc445ca74-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.107903 4658 generic.go:334] "Generic (PLEG): container finished" podID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerID="7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd" exitCode=0 Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.107941 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerDied","Data":"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd"} Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.107966 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hzp6" event={"ID":"fb1a6bdc-cc68-4061-b203-94ffc445ca74","Type":"ContainerDied","Data":"12c25d42cdaac636a3b0e1c4d024e55b94f2ec2a50fc67078872feabf07bb9ed"} Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.107983 4658 scope.go:117] "RemoveContainer" containerID="7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.107986 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9hzp6" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.132034 4658 scope.go:117] "RemoveContainer" containerID="7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.144874 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.156753 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9hzp6"] Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.171123 4658 scope.go:117] "RemoveContainer" containerID="34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.200805 4658 scope.go:117] "RemoveContainer" containerID="7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd" Oct 02 12:40:35 crc kubenswrapper[4658]: E1002 12:40:35.201323 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd\": container with ID starting with 7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd not found: ID does not exist" containerID="7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.201376 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd"} err="failed to get container status \"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd\": rpc error: code = NotFound desc = could not find container \"7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd\": container with ID starting with 7d5abd72d1bfd5caa93d1c745e9ee379b00e9851f2a729d4df5b5a21dc01f1bd not found: ID does not exist" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.201410 4658 scope.go:117] "RemoveContainer" containerID="7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c" Oct 02 12:40:35 crc kubenswrapper[4658]: E1002 12:40:35.201796 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c\": container with ID starting with 7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c not found: ID does not exist" containerID="7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.201833 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c"} err="failed to get container status \"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c\": rpc error: code = NotFound desc = could not find container \"7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c\": container with ID starting with 7b309041973d23dfe4e50c5677a33b9254a9adadb52817dc7115fd1bfbe90b2c not found: ID does not exist" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.201858 4658 scope.go:117] "RemoveContainer" containerID="34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8" Oct 02 12:40:35 crc kubenswrapper[4658]: E1002 12:40:35.202173 4658 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8\": container with ID starting with 34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8 not found: ID does not exist" containerID="34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.202234 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8"} err="failed to get container status \"34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8\": rpc error: code = NotFound desc = could not find container \"34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8\": container with ID starting with 34e25d1f462f527735e08d93de8e5f9ad5c3b2dad9e88387238200bb2b95bde8 not found: ID does not exist" Oct 02 12:40:35 crc kubenswrapper[4658]: I1002 12:40:35.962434 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" path="/var/lib/kubelet/pods/fb1a6bdc-cc68-4061-b203-94ffc445ca74/volumes" Oct 02 12:41:27 crc kubenswrapper[4658]: I1002 12:41:27.429530 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:41:27 crc kubenswrapper[4658]: I1002 12:41:27.430124 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:41:57 crc kubenswrapper[4658]: I1002 12:41:57.429900 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:41:57 crc kubenswrapper[4658]: I1002 12:41:57.431435 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:42:27 crc kubenswrapper[4658]: I1002 12:42:27.430810 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:42:27 crc kubenswrapper[4658]: I1002 12:42:27.431325 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:42:27 crc kubenswrapper[4658]: I1002 12:42:27.431378 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:42:27 crc kubenswrapper[4658]: I1002 12:42:27.432087 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:42:27 crc kubenswrapper[4658]: I1002 12:42:27.432137 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" gracePeriod=600 Oct 02 12:42:27 crc kubenswrapper[4658]: E1002 12:42:27.583438 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:42:28 crc kubenswrapper[4658]: I1002 12:42:28.312789 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" exitCode=0 Oct 02 12:42:28 crc kubenswrapper[4658]: I1002 12:42:28.312856 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86"} Oct 02 12:42:28 crc kubenswrapper[4658]: I1002 12:42:28.313352 4658 scope.go:117] "RemoveContainer" containerID="1c8a86971304afa5d0350daaf2e54a3f3944c3a6bd4d8c870317dd730d29eda5" Oct 02 12:42:28 crc kubenswrapper[4658]: I1002 12:42:28.314034 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:42:28 crc kubenswrapper[4658]: E1002 12:42:28.314374 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:42:39 crc kubenswrapper[4658]: I1002 12:42:39.958156 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:42:39 crc kubenswrapper[4658]: E1002 12:42:39.959052 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:42:52 crc 
kubenswrapper[4658]: I1002 12:42:52.949814 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:42:52 crc kubenswrapper[4658]: E1002 12:42:52.950535 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:43:04 crc kubenswrapper[4658]: I1002 12:43:04.950111 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:43:04 crc kubenswrapper[4658]: E1002 12:43:04.950812 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:43:19 crc kubenswrapper[4658]: I1002 12:43:19.967318 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:43:19 crc kubenswrapper[4658]: E1002 12:43:19.969178 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:43:34 crc kubenswrapper[4658]: I1002 12:43:34.949815 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:43:34 crc kubenswrapper[4658]: E1002 12:43:34.951059 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:43:48 crc kubenswrapper[4658]: I1002 12:43:48.949139 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:43:48 crc kubenswrapper[4658]: E1002 12:43:48.949956 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:44:03 crc kubenswrapper[4658]: I1002 12:44:03.949468 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:44:03 crc 
kubenswrapper[4658]: E1002 12:44:03.950284 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:44:17 crc kubenswrapper[4658]: I1002 12:44:17.949241 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:44:17 crc kubenswrapper[4658]: E1002 12:44:17.950452 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:44:30 crc kubenswrapper[4658]: I1002 12:44:30.948826 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:44:30 crc kubenswrapper[4658]: E1002 12:44:30.949561 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:44:42 crc kubenswrapper[4658]: I1002 12:44:42.949238 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:44:42 crc kubenswrapper[4658]: E1002 12:44:42.950694 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:44:55 crc kubenswrapper[4658]: I1002 12:44:55.949967 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:44:55 crc kubenswrapper[4658]: E1002 12:44:55.950920 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.151100 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926"] Oct 02 12:45:00 crc kubenswrapper[4658]: E1002 12:45:00.152538 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" 
containerName="registry-server" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.152558 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="registry-server" Oct 02 12:45:00 crc kubenswrapper[4658]: E1002 12:45:00.152582 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="extract-utilities" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.152590 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="extract-utilities" Oct 02 12:45:00 crc kubenswrapper[4658]: E1002 12:45:00.152606 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="extract-content" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.152614 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="extract-content" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.152988 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1a6bdc-cc68-4061-b203-94ffc445ca74" containerName="registry-server" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.153847 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.156841 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.160424 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.162980 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926"] Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.289679 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.289863 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.289905 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj6xt\" (UniqueName: \"kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.391524 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj6xt\" (UniqueName: 
\"kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.391665 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.391851 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.392824 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.397418 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.407527 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj6xt\" (UniqueName: \"kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt\") pod \"collect-profiles-29323485-jc926\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.491698 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:00 crc kubenswrapper[4658]: I1002 12:45:00.965324 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926"] Oct 02 12:45:01 crc kubenswrapper[4658]: I1002 12:45:01.774830 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" event={"ID":"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296","Type":"ContainerStarted","Data":"cbb07f4f2ccb3d6fdadfe7615afecbbd371383e8a01b53cfefd22efe88194d50"} Oct 02 12:45:02 crc kubenswrapper[4658]: I1002 12:45:02.787188 4658 generic.go:334] "Generic (PLEG): container finished" podID="e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" containerID="f942e55b4401911d77a346060d7f0939c3765bfe87d4382edc7c17d0e2b9ce52" exitCode=0 Oct 02 12:45:02 crc kubenswrapper[4658]: I1002 12:45:02.787345 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" event={"ID":"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296","Type":"ContainerDied","Data":"f942e55b4401911d77a346060d7f0939c3765bfe87d4382edc7c17d0e2b9ce52"} Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.172721 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.270626 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj6xt\" (UniqueName: \"kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt\") pod \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.270926 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume\") pod \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.271014 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume\") pod \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\" (UID: \"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296\") " Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.272609 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" (UID: "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.278987 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt" (OuterVolumeSpecName: "kube-api-access-jj6xt") pod "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" (UID: "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296"). InnerVolumeSpecName "kube-api-access-jj6xt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.280523 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" (UID: "e5e472b8-d8c1-4788-a8a0-d86b3bf7d296"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.374151 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.374199 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj6xt\" (UniqueName: \"kubernetes.io/projected/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-kube-api-access-jj6xt\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.374212 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e472b8-d8c1-4788-a8a0-d86b3bf7d296-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.808436 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" event={"ID":"e5e472b8-d8c1-4788-a8a0-d86b3bf7d296","Type":"ContainerDied","Data":"cbb07f4f2ccb3d6fdadfe7615afecbbd371383e8a01b53cfefd22efe88194d50"} Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.808803 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbb07f4f2ccb3d6fdadfe7615afecbbd371383e8a01b53cfefd22efe88194d50" Oct 02 12:45:04 crc kubenswrapper[4658]: I1002 12:45:04.808473 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jc926" Oct 02 12:45:05 crc kubenswrapper[4658]: I1002 12:45:05.243307 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58"] Oct 02 12:45:05 crc kubenswrapper[4658]: I1002 12:45:05.250448 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-b6k58"] Oct 02 12:45:05 crc kubenswrapper[4658]: I1002 12:45:05.960138 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e476dd-9cf3-4c98-8005-3e822bfc1053" path="/var/lib/kubelet/pods/d7e476dd-9cf3-4c98-8005-3e822bfc1053/volumes" Oct 02 12:45:07 crc kubenswrapper[4658]: I1002 12:45:07.950854 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:45:07 crc kubenswrapper[4658]: E1002 12:45:07.951539 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:45:22 crc kubenswrapper[4658]: I1002 12:45:22.949923 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:45:22 crc kubenswrapper[4658]: E1002 12:45:22.950937 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:45:25 crc kubenswrapper[4658]: I1002 12:45:25.215815 4658 scope.go:117] "RemoveContainer" containerID="e125a5c53ca68207489e4af241f5b192d51ffe20153b53733ab606af0810f7bd" Oct 02 12:45:37 crc kubenswrapper[4658]: I1002 12:45:37.950507 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:45:37 crc kubenswrapper[4658]: E1002 12:45:37.951554 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:45:48 crc kubenswrapper[4658]: I1002 12:45:48.949627 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:45:48 crc kubenswrapper[4658]: E1002 12:45:48.950937 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:46:01 crc kubenswrapper[4658]: I1002 12:46:01.949350 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:46:01 crc kubenswrapper[4658]: E1002 12:46:01.950525 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:46:13 crc kubenswrapper[4658]: I1002 12:46:13.949060 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:46:13 crc kubenswrapper[4658]: E1002 12:46:13.950039 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:46:24 crc kubenswrapper[4658]: I1002 12:46:24.948999 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:46:24 crc kubenswrapper[4658]: E1002 12:46:24.950164 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:46:39 crc kubenswrapper[4658]: I1002 12:46:39.962073 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:46:39 crc kubenswrapper[4658]: E1002 12:46:39.964616 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:46:52 crc kubenswrapper[4658]: I1002 12:46:52.949929 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:46:52 crc kubenswrapper[4658]: E1002 12:46:52.951262 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:47:06 crc kubenswrapper[4658]: I1002 12:47:06.948945 4658 
scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:47:06 crc kubenswrapper[4658]: E1002 12:47:06.949935 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:47:17 crc kubenswrapper[4658]: I1002 12:47:17.950997 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:47:17 crc kubenswrapper[4658]: E1002 12:47:17.951773 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:47:32 crc kubenswrapper[4658]: I1002 12:47:32.949440 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:47:33 crc kubenswrapper[4658]: I1002 12:47:33.500625 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6"} Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.509699 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qptv5"] Oct 02 12:48:07 crc kubenswrapper[4658]: E1002 12:48:07.510655 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" containerName="collect-profiles" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.510674 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" containerName="collect-profiles" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.510943 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e472b8-d8c1-4788-a8a0-d86b3bf7d296" containerName="collect-profiles" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.512459 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.556389 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qptv5"] Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.695070 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.695569 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.695719 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78dbh\" (UniqueName: \"kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.797140 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.797704 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78dbh\" (UniqueName: \"kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.797897 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.798656 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.799087 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.820183 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-78dbh\" (UniqueName: \"kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh\") pod \"certified-operators-qptv5\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:07 crc kubenswrapper[4658]: I1002 12:48:07.864835 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:08 crc kubenswrapper[4658]: I1002 12:48:08.441522 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qptv5"] Oct 02 12:48:08 crc kubenswrapper[4658]: I1002 12:48:08.858008 4658 generic.go:334] "Generic (PLEG): container finished" podID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerID="0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496" exitCode=0 Oct 02 12:48:08 crc kubenswrapper[4658]: I1002 12:48:08.858056 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerDied","Data":"0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496"} Oct 02 12:48:08 crc kubenswrapper[4658]: I1002 12:48:08.858091 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerStarted","Data":"05dcfdf69a7cb1062604db8712a4ce05aa94a02581cebd3f4a9a96269d8b52cd"} Oct 02 12:48:08 crc kubenswrapper[4658]: I1002 12:48:08.860639 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:48:10 crc kubenswrapper[4658]: I1002 12:48:10.882216 4658 generic.go:334] "Generic (PLEG): container finished" podID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerID="27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e" exitCode=0 Oct 02 12:48:10 crc kubenswrapper[4658]: I1002 12:48:10.882273 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerDied","Data":"27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e"} Oct 02 12:48:11 crc kubenswrapper[4658]: I1002 12:48:11.894177 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerStarted","Data":"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"} Oct 02 12:48:11 crc kubenswrapper[4658]: I1002 12:48:11.914414 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qptv5" podStartSLOduration=2.470876196 podStartE2EDuration="4.914382169s" podCreationTimestamp="2025-10-02 12:48:07 +0000 UTC" firstStartedPulling="2025-10-02 12:48:08.860252801 +0000 UTC m=+5369.751406378" lastFinishedPulling="2025-10-02 12:48:11.303758744 +0000 UTC m=+5372.194912351" observedRunningTime="2025-10-02 12:48:11.910878557 +0000 UTC m=+5372.802032144" watchObservedRunningTime="2025-10-02 12:48:11.914382169 +0000 UTC m=+5372.805535736" Oct 02 12:48:17 crc kubenswrapper[4658]: I1002 12:48:17.865345 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:17 crc kubenswrapper[4658]: I1002 12:48:17.865817 4658 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:17 crc kubenswrapper[4658]: I1002 12:48:17.962648 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:18 crc kubenswrapper[4658]: I1002 12:48:18.020776 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:18 crc kubenswrapper[4658]: I1002 12:48:18.205415 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qptv5"] Oct 02 12:48:19 crc kubenswrapper[4658]: I1002 12:48:19.973541 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qptv5" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="registry-server" containerID="cri-o://f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9" gracePeriod=2 Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.575596 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qptv5" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.677169 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content\") pod \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.677232 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78dbh\" (UniqueName: \"kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh\") pod \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.677325 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities\") pod \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\" (UID: \"fa80fff4-cd71-4d73-bdcc-93a730d68b2c\") " Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.678237 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities" (OuterVolumeSpecName: "utilities") pod "fa80fff4-cd71-4d73-bdcc-93a730d68b2c" (UID: "fa80fff4-cd71-4d73-bdcc-93a730d68b2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.684777 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh" (OuterVolumeSpecName: "kube-api-access-78dbh") pod "fa80fff4-cd71-4d73-bdcc-93a730d68b2c" (UID: "fa80fff4-cd71-4d73-bdcc-93a730d68b2c"). InnerVolumeSpecName "kube-api-access-78dbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.728338 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa80fff4-cd71-4d73-bdcc-93a730d68b2c" (UID: "fa80fff4-cd71-4d73-bdcc-93a730d68b2c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.779607 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.779639 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78dbh\" (UniqueName: \"kubernetes.io/projected/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-kube-api-access-78dbh\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.779651 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80fff4-cd71-4d73-bdcc-93a730d68b2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.985262 4658 generic.go:334] "Generic (PLEG): container finished" podID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerID="f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9" exitCode=0 Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.985345 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerDied","Data":"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"} Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.986561 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qptv5" event={"ID":"fa80fff4-cd71-4d73-bdcc-93a730d68b2c","Type":"ContainerDied","Data":"05dcfdf69a7cb1062604db8712a4ce05aa94a02581cebd3f4a9a96269d8b52cd"} Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.985401 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qptv5"
Oct 02 12:48:20 crc kubenswrapper[4658]: I1002 12:48:20.986602 4658 scope.go:117] "RemoveContainer" containerID="f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.008602 4658 scope.go:117] "RemoveContainer" containerID="27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.022971 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qptv5"]
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.037395 4658 scope.go:117] "RemoveContainer" containerID="0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.040677 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qptv5"]
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.099660 4658 scope.go:117] "RemoveContainer" containerID="f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"
Oct 02 12:48:21 crc kubenswrapper[4658]: E1002 12:48:21.100112 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9\": container with ID starting with f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9 not found: ID does not exist" containerID="f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.100275 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9"} err="failed to get container status \"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9\": rpc error: code = NotFound desc = could not find container \"f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9\": container with ID starting with f4adcad4fbd8e5f7e93fa265ae24716f64cafce75a719506072a4fe59dfdd6a9 not found: ID does not exist"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.100419 4658 scope.go:117] "RemoveContainer" containerID="27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e"
Oct 02 12:48:21 crc kubenswrapper[4658]: E1002 12:48:21.101014 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e\": container with ID starting with 27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e not found: ID does not exist" containerID="27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.101051 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e"} err="failed to get container status \"27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e\": rpc error: code = NotFound desc = could not find container \"27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e\": container with ID starting with 27dc65297c9a2e05b7e342c727423163a90964dc2e9fa395baa9e3d108b5bd4e not found: ID does not exist"
Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.101092 4658 scope.go:117] "RemoveContainer" containerID="0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496"
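Every "RemoveContainer" after the "SyncLoop REMOVE" event races with cleanup that already ran, so the E-level "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" records here are benign: the kubelet asks CRI-O about a container it has just deleted, gets NotFound, and moves on. When scanning a journal like this one it helps to drop that pattern before looking at what remains; a triage sketch (the substring matches are rough heuristics over klog output, not a kubelet interface):

// notfound_triage.go: reads journal records on stdin (assumes one record
// per line) and prints only error-level kubelet records that are NOT the
// benign "container already gone" race seen above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // records can be very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, "]: E") { // keep only E-severity records
			continue
		}
		if strings.Contains(line, "not found: ID does not exist") {
			continue // benign: status probe raced with a completed delete
		}
		fmt.Println(line) // anything left deserves a human look
	}
}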
containerID="0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496" Oct 02 12:48:21 crc kubenswrapper[4658]: E1002 12:48:21.101402 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496\": container with ID starting with 0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496 not found: ID does not exist" containerID="0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496" Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.101472 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496"} err="failed to get container status \"0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496\": rpc error: code = NotFound desc = could not find container \"0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496\": container with ID starting with 0200a21c71615005b0d3f04aef87bd855f6628dae3612182fd1db9b6486cd496 not found: ID does not exist" Oct 02 12:48:21 crc kubenswrapper[4658]: I1002 12:48:21.959010 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" path="/var/lib/kubelet/pods/fa80fff4-cd71-4d73-bdcc-93a730d68b2c/volumes" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.102697 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"] Oct 02 12:48:29 crc kubenswrapper[4658]: E1002 12:48:29.104956 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="extract-content" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.105010 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="extract-content" Oct 02 12:48:29 crc kubenswrapper[4658]: E1002 12:48:29.105029 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="extract-utilities" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.105039 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="extract-utilities" Oct 02 12:48:29 crc kubenswrapper[4658]: E1002 12:48:29.105071 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="registry-server" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.105078 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="registry-server" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.105272 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa80fff4-cd71-4d73-bdcc-93a730d68b2c" containerName="registry-server" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.106760 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.113054 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"] Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.244497 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hlnb\" (UniqueName: \"kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.244869 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.244916 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.347134 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hlnb\" (UniqueName: \"kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.347249 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.347322 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.347888 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.348486 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.372370 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9hlnb\" (UniqueName: \"kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb\") pod \"redhat-marketplace-ds8p7\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") " pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.472884 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:29 crc kubenswrapper[4658]: W1002 12:48:29.948241 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d72b3c5_529c_4d5d_88da_2538a42277b6.slice/crio-c478bcd2a4af416835c69532d76bb0ac7d55ed4f58a6c0ffec992f5dd7752502 WatchSource:0}: Error finding container c478bcd2a4af416835c69532d76bb0ac7d55ed4f58a6c0ffec992f5dd7752502: Status 404 returned error can't find the container with id c478bcd2a4af416835c69532d76bb0ac7d55ed4f58a6c0ffec992f5dd7752502 Oct 02 12:48:29 crc kubenswrapper[4658]: I1002 12:48:29.965462 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"] Oct 02 12:48:30 crc kubenswrapper[4658]: I1002 12:48:30.089981 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerStarted","Data":"c478bcd2a4af416835c69532d76bb0ac7d55ed4f58a6c0ffec992f5dd7752502"} Oct 02 12:48:31 crc kubenswrapper[4658]: I1002 12:48:31.103713 4658 generic.go:334] "Generic (PLEG): container finished" podID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerID="b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee" exitCode=0 Oct 02 12:48:31 crc kubenswrapper[4658]: I1002 12:48:31.103810 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerDied","Data":"b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee"} Oct 02 12:48:32 crc kubenswrapper[4658]: I1002 12:48:32.114210 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerStarted","Data":"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee"} Oct 02 12:48:33 crc kubenswrapper[4658]: I1002 12:48:33.123901 4658 generic.go:334] "Generic (PLEG): container finished" podID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerID="f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee" exitCode=0 Oct 02 12:48:33 crc kubenswrapper[4658]: I1002 12:48:33.124154 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerDied","Data":"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee"} Oct 02 12:48:34 crc kubenswrapper[4658]: I1002 12:48:34.135040 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerStarted","Data":"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee"} Oct 02 12:48:34 crc kubenswrapper[4658]: I1002 12:48:34.158631 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ds8p7" podStartSLOduration=2.521791132 
Oct 02 12:48:39 crc kubenswrapper[4658]: I1002 12:48:39.473454 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ds8p7"
Oct 02 12:48:39 crc kubenswrapper[4658]: I1002 12:48:39.474106 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ds8p7"
Oct 02 12:48:39 crc kubenswrapper[4658]: I1002 12:48:39.550731 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ds8p7"
Oct 02 12:48:40 crc kubenswrapper[4658]: I1002 12:48:40.276538 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ds8p7"
Oct 02 12:48:40 crc kubenswrapper[4658]: I1002 12:48:40.319463 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"]
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.234451 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ds8p7" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="registry-server" containerID="cri-o://dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee" gracePeriod=2
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.781233 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds8p7"
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.944235 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content\") pod \"9d72b3c5-529c-4d5d-88da-2538a42277b6\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") "
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.944597 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities\") pod \"9d72b3c5-529c-4d5d-88da-2538a42277b6\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") "
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.944649 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hlnb\" (UniqueName: \"kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb\") pod \"9d72b3c5-529c-4d5d-88da-2538a42277b6\" (UID: \"9d72b3c5-529c-4d5d-88da-2538a42277b6\") "
Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.945627 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities" (OuterVolumeSpecName: "utilities") pod "9d72b3c5-529c-4d5d-88da-2538a42277b6" (UID: "9d72b3c5-529c-4d5d-88da-2538a42277b6"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.950154 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb" (OuterVolumeSpecName: "kube-api-access-9hlnb") pod "9d72b3c5-529c-4d5d-88da-2538a42277b6" (UID: "9d72b3c5-529c-4d5d-88da-2538a42277b6"). InnerVolumeSpecName "kube-api-access-9hlnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:48:42 crc kubenswrapper[4658]: I1002 12:48:42.965896 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d72b3c5-529c-4d5d-88da-2538a42277b6" (UID: "9d72b3c5-529c-4d5d-88da-2538a42277b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.046812 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.046847 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72b3c5-529c-4d5d-88da-2538a42277b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.046857 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hlnb\" (UniqueName: \"kubernetes.io/projected/9d72b3c5-529c-4d5d-88da-2538a42277b6-kube-api-access-9hlnb\") on node \"crc\" DevicePath \"\"" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.246964 4658 generic.go:334] "Generic (PLEG): container finished" podID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerID="dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee" exitCode=0 Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.247016 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerDied","Data":"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee"} Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.247050 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds8p7" event={"ID":"9d72b3c5-529c-4d5d-88da-2538a42277b6","Type":"ContainerDied","Data":"c478bcd2a4af416835c69532d76bb0ac7d55ed4f58a6c0ffec992f5dd7752502"} Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.247074 4658 scope.go:117] "RemoveContainer" containerID="dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.247260 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds8p7" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.276755 4658 scope.go:117] "RemoveContainer" containerID="f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.281371 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"] Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.291565 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds8p7"] Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.305806 4658 scope.go:117] "RemoveContainer" containerID="b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377076 4658 scope.go:117] "RemoveContainer" containerID="dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee" Oct 02 12:48:43 crc kubenswrapper[4658]: E1002 12:48:43.377497 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee\": container with ID starting with dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee not found: ID does not exist" containerID="dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377531 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee"} err="failed to get container status \"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee\": rpc error: code = NotFound desc = could not find container \"dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee\": container with ID starting with dcdb016856a60eaca1d610df0d650d8db7cf001acece88d543e6637d5a6f87ee not found: ID does not exist" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377551 4658 scope.go:117] "RemoveContainer" containerID="f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee" Oct 02 12:48:43 crc kubenswrapper[4658]: E1002 12:48:43.377735 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee\": container with ID starting with f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee not found: ID does not exist" containerID="f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377757 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee"} err="failed to get container status \"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee\": rpc error: code = NotFound desc = could not find container \"f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee\": container with ID starting with f7c0e2bd65ed4851e83242ee261b2d54845547614f503ad58fb1e011f00277ee not found: ID does not exist" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377769 4658 scope.go:117] "RemoveContainer" containerID="b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee" Oct 02 12:48:43 crc kubenswrapper[4658]: E1002 12:48:43.377930 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee\": container with ID starting with b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee not found: ID does not exist" containerID="b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.377950 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee"} err="failed to get container status \"b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee\": rpc error: code = NotFound desc = could not find container \"b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee\": container with ID starting with b70b8e942d9a1b0cbe531f99f755773e64357a29de00dcc9c0d52618e7e8f8ee not found: ID does not exist" Oct 02 12:48:43 crc kubenswrapper[4658]: I1002 12:48:43.962607 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" path="/var/lib/kubelet/pods/9d72b3c5-529c-4d5d-88da-2538a42277b6/volumes" Oct 02 12:49:46 crc kubenswrapper[4658]: I1002 12:49:46.880187 4658 generic.go:334] "Generic (PLEG): container finished" podID="fd9ceedd-f5a7-425a-9112-998edc1d3e00" containerID="2eeff51e8f7d15bc6e32c95d28eda08c055da1b1e799602917a584e943d080f9" exitCode=0 Oct 02 12:49:46 crc kubenswrapper[4658]: I1002 12:49:46.880279 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fd9ceedd-f5a7-425a-9112-998edc1d3e00","Type":"ContainerDied","Data":"2eeff51e8f7d15bc6e32c95d28eda08c055da1b1e799602917a584e943d080f9"} Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.310619 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.384846 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.384939 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.384990 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385080 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcb8r\" (UniqueName: \"kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385156 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385249 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385276 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385333 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.385358 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config\") pod \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\" (UID: \"fd9ceedd-f5a7-425a-9112-998edc1d3e00\") " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.387410 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.387603 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data" (OuterVolumeSpecName: "config-data") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.399849 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r" (OuterVolumeSpecName: "kube-api-access-hcb8r") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "kube-api-access-hcb8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.405891 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.413028 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.427024 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.432310 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.452841 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488243 4658 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488276 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488287 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488393 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcb8r\" (UniqueName: \"kubernetes.io/projected/fd9ceedd-f5a7-425a-9112-998edc1d3e00-kube-api-access-hcb8r\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488423 4658 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488434 4658 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488443 4658 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd9ceedd-f5a7-425a-9112-998edc1d3e00-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.488452 4658 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd9ceedd-f5a7-425a-9112-998edc1d3e00-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.508336 4658 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.590559 4658 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.761714 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "fd9ceedd-f5a7-425a-9112-998edc1d3e00" (UID: "fd9ceedd-f5a7-425a-9112-998edc1d3e00"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.793962 4658 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fd9ceedd-f5a7-425a-9112-998edc1d3e00-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.903782 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fd9ceedd-f5a7-425a-9112-998edc1d3e00","Type":"ContainerDied","Data":"6f8ecffc94a678d95013b478002df36759405cc22d0725995edecf9f36328ee7"} Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.903832 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8ecffc94a678d95013b478002df36759405cc22d0725995edecf9f36328ee7" Oct 02 12:49:48 crc kubenswrapper[4658]: I1002 12:49:48.903892 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 12:49:57 crc kubenswrapper[4658]: I1002 12:49:57.430009 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:49:57 crc kubenswrapper[4658]: I1002 12:49:57.430778 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.592501 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:49:59 crc kubenswrapper[4658]: E1002 12:49:59.594631 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="extract-utilities" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.594683 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="extract-utilities" Oct 02 12:49:59 crc kubenswrapper[4658]: E1002 12:49:59.594713 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9ceedd-f5a7-425a-9112-998edc1d3e00" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.594753 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9ceedd-f5a7-425a-9112-998edc1d3e00" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:49:59 crc kubenswrapper[4658]: E1002 12:49:59.594871 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="registry-server" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.594894 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="registry-server" Oct 02 12:49:59 crc kubenswrapper[4658]: E1002 12:49:59.594927 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="extract-content" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.594944 4658 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="extract-content" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.596545 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d72b3c5-529c-4d5d-88da-2538a42277b6" containerName="registry-server" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.596613 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9ceedd-f5a7-425a-9112-998edc1d3e00" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.598072 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.601416 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kt87g" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.604725 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.743219 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67njj\" (UniqueName: \"kubernetes.io/projected/9fb066d3-ce67-4635-bebd-2e24da16a2a8-kube-api-access-67njj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.743587 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.845006 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67njj\" (UniqueName: \"kubernetes.io/projected/9fb066d3-ce67-4635-bebd-2e24da16a2a8-kube-api-access-67njj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.845192 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.845723 4658 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.873853 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67njj\" (UniqueName: \"kubernetes.io/projected/9fb066d3-ce67-4635-bebd-2e24da16a2a8-kube-api-access-67njj\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.878284 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9fb066d3-ce67-4635-bebd-2e24da16a2a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:49:59 crc kubenswrapper[4658]: I1002 12:49:59.927382 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:50:00 crc kubenswrapper[4658]: I1002 12:50:00.357096 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:50:01 crc kubenswrapper[4658]: I1002 12:50:01.041675 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9fb066d3-ce67-4635-bebd-2e24da16a2a8","Type":"ContainerStarted","Data":"48e1adcc129c67a9d92b9ce109b25209f67eff38589b0b2000991185f921dc98"} Oct 02 12:50:02 crc kubenswrapper[4658]: I1002 12:50:02.053953 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9fb066d3-ce67-4635-bebd-2e24da16a2a8","Type":"ContainerStarted","Data":"6b7222727db99075ac003dc78c4455df330f3e58aa6e21f04d413414e5c18bed"} Oct 02 12:50:02 crc kubenswrapper[4658]: I1002 12:50:02.081707 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.02791203 podStartE2EDuration="3.081680273s" podCreationTimestamp="2025-10-02 12:49:59 +0000 UTC" firstStartedPulling="2025-10-02 12:50:00.374360902 +0000 UTC m=+5481.265514479" lastFinishedPulling="2025-10-02 12:50:01.428129155 +0000 UTC m=+5482.319282722" observedRunningTime="2025-10-02 12:50:02.074263176 +0000 UTC m=+5482.965416753" watchObservedRunningTime="2025-10-02 12:50:02.081680273 +0000 UTC m=+5482.972833870" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.442567 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mtlb6/must-gather-fkzcz"] Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.444597 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.446129 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mtlb6"/"kube-root-ca.crt" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.446381 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mtlb6"/"openshift-service-ca.crt" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.446509 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mtlb6"/"default-dockercfg-5d6pw" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.454524 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mtlb6/must-gather-fkzcz"] Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.533026 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx84m\" (UniqueName: \"kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.533570 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.635979 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx84m\" (UniqueName: \"kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.636385 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.636789 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.667216 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx84m\" (UniqueName: \"kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m\") pod \"must-gather-fkzcz\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") " pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:22 crc kubenswrapper[4658]: I1002 12:50:22.765673 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" Oct 02 12:50:23 crc kubenswrapper[4658]: I1002 12:50:23.378881 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mtlb6/must-gather-fkzcz"] Oct 02 12:50:24 crc kubenswrapper[4658]: I1002 12:50:24.283120 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" event={"ID":"64073b89-1a4e-4ef4-b876-f24d3148632c","Type":"ContainerStarted","Data":"c1e3aaa149b1dcad59db962f5f8b2f4cd3976a7d4cf27ecc17c73b092dfa64b8"} Oct 02 12:50:27 crc kubenswrapper[4658]: I1002 12:50:27.430524 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:50:27 crc kubenswrapper[4658]: I1002 12:50:27.430826 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:50:31 crc kubenswrapper[4658]: I1002 12:50:31.378372 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" event={"ID":"64073b89-1a4e-4ef4-b876-f24d3148632c","Type":"ContainerStarted","Data":"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"} Oct 02 12:50:32 crc kubenswrapper[4658]: I1002 12:50:32.396981 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" event={"ID":"64073b89-1a4e-4ef4-b876-f24d3148632c","Type":"ContainerStarted","Data":"b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0"} Oct 02 12:50:32 crc kubenswrapper[4658]: I1002 12:50:32.421533 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" podStartSLOduration=2.80376769 podStartE2EDuration="10.421503577s" podCreationTimestamp="2025-10-02 12:50:22 +0000 UTC" firstStartedPulling="2025-10-02 12:50:23.388270065 +0000 UTC m=+5504.279423632" lastFinishedPulling="2025-10-02 12:50:31.006005952 +0000 UTC m=+5511.897159519" observedRunningTime="2025-10-02 12:50:32.413689307 +0000 UTC m=+5513.304842904" watchObservedRunningTime="2025-10-02 12:50:32.421503577 +0000 UTC m=+5513.312657144" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.359583 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-6kcd5"] Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.361797 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.409574 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bphtm\" (UniqueName: \"kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.409678 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.512201 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bphtm\" (UniqueName: \"kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.512370 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.512527 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.541870 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bphtm\" (UniqueName: \"kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm\") pod \"crc-debug-6kcd5\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:40 crc kubenswrapper[4658]: I1002 12:50:40.692899 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:50:41 crc kubenswrapper[4658]: I1002 12:50:41.480289 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" event={"ID":"3932b79e-fb70-4697-a1e0-0008b7cf9ae3","Type":"ContainerStarted","Data":"b428ffc5cf14e1c079eac3850af8ed3c3b1c518db443dad204de1ebdb07b2653"} Oct 02 12:50:57 crc kubenswrapper[4658]: I1002 12:50:57.429508 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:50:57 crc kubenswrapper[4658]: I1002 12:50:57.430199 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:50:57 crc kubenswrapper[4658]: I1002 12:50:57.430244 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:50:57 crc kubenswrapper[4658]: I1002 12:50:57.430972 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:50:57 crc kubenswrapper[4658]: I1002 12:50:57.431020 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6" gracePeriod=600 Oct 02 12:50:59 crc kubenswrapper[4658]: I1002 12:50:59.671789 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6" exitCode=0 Oct 02 12:50:59 crc kubenswrapper[4658]: I1002 12:50:59.671866 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6"} Oct 02 12:50:59 crc kubenswrapper[4658]: I1002 12:50:59.672417 4658 scope.go:117] "RemoveContainer" containerID="47165bec7d768ae10c68c43eb916b40466c0776f9b4d532530ddc6a59bf4eb86" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.191930 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"] Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.194692 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.253011 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"] Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.281493 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.281550 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.281685 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7b5\" (UniqueName: \"kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.383478 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7b5\" (UniqueName: \"kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.383555 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.383584 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.384025 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.384484 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.424808 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gj7b5\" (UniqueName: \"kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5\") pod \"redhat-operators-llmvm\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:04 crc kubenswrapper[4658]: I1002 12:51:04.578904 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:09 crc kubenswrapper[4658]: E1002 12:51:09.503307 4658 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Oct 02 12:51:09 crc kubenswrapper[4658]: E1002 12:51:09.506162 4658 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bphtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-6kcd5_openshift-must-gather-mtlb6(3932b79e-fb70-4697-a1e0-0008b7cf9ae3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 12:51:09 crc kubenswrapper[4658]: E1002 12:51:09.507388 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" Oct 02 
Oct 02 12:51:09 crc kubenswrapper[4658]: I1002 12:51:09.509123 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"]
Oct 02 12:51:09 crc kubenswrapper[4658]: I1002 12:51:09.775908 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerStarted","Data":"1e73365db8ebe7a2bd282d074a2b380bfc565e8855bfeb5276b9099366d81608"}
Oct 02 12:51:09 crc kubenswrapper[4658]: I1002 12:51:09.779421 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"}
Oct 02 12:51:09 crc kubenswrapper[4658]: E1002 12:51:09.780542 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3"
Oct 02 12:51:10 crc kubenswrapper[4658]: I1002 12:51:10.789426 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerID="d759308ba7e29bde5b9c46acb6bd5d25e7fac5b1ca902eba554db7f30488b826" exitCode=0
Oct 02 12:51:10 crc kubenswrapper[4658]: I1002 12:51:10.789481 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerDied","Data":"d759308ba7e29bde5b9c46acb6bd5d25e7fac5b1ca902eba554db7f30488b826"}
Oct 02 12:51:13 crc kubenswrapper[4658]: I1002 12:51:13.819149 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerStarted","Data":"e24e0e66ae7e8a9a518c6755c1314a0399dde1995f67e8090f573ca394d0482b"}
Oct 02 12:51:14 crc kubenswrapper[4658]: I1002 12:51:14.829131 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerID="e24e0e66ae7e8a9a518c6755c1314a0399dde1995f67e8090f573ca394d0482b" exitCode=0
Oct 02 12:51:14 crc kubenswrapper[4658]: I1002 12:51:14.829515 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerDied","Data":"e24e0e66ae7e8a9a518c6755c1314a0399dde1995f67e8090f573ca394d0482b"}
Oct 02 12:51:23 crc kubenswrapper[4658]: I1002 12:51:23.911961 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerStarted","Data":"249d595c4fca330cca1c94883519d7676bcbf2466c129b58d73bda15aafdb0f7"}
observedRunningTime="2025-10-02 12:51:23.930431079 +0000 UTC m=+5564.821584666" watchObservedRunningTime="2025-10-02 12:51:23.936238574 +0000 UTC m=+5564.827392141" Oct 02 12:51:24 crc kubenswrapper[4658]: I1002 12:51:24.580245 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:24 crc kubenswrapper[4658]: I1002 12:51:24.580286 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:25 crc kubenswrapper[4658]: I1002 12:51:25.629803 4658 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llmvm" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="registry-server" probeResult="failure" output=< Oct 02 12:51:25 crc kubenswrapper[4658]: timeout: failed to connect service ":50051" within 1s Oct 02 12:51:25 crc kubenswrapper[4658]: > Oct 02 12:51:27 crc kubenswrapper[4658]: I1002 12:51:27.947710 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" event={"ID":"3932b79e-fb70-4697-a1e0-0008b7cf9ae3","Type":"ContainerStarted","Data":"61244543cbc6c29edf40b5592f7a6395eb1b72a1a3132c344989254b28f9c772"} Oct 02 12:51:27 crc kubenswrapper[4658]: I1002 12:51:27.962538 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" podStartSLOduration=1.5212824999999999 podStartE2EDuration="47.962518254s" podCreationTimestamp="2025-10-02 12:50:40 +0000 UTC" firstStartedPulling="2025-10-02 12:50:40.755522981 +0000 UTC m=+5521.646676548" lastFinishedPulling="2025-10-02 12:51:27.196758735 +0000 UTC m=+5568.087912302" observedRunningTime="2025-10-02 12:51:27.96179934 +0000 UTC m=+5568.852952907" watchObservedRunningTime="2025-10-02 12:51:27.962518254 +0000 UTC m=+5568.853671821" Oct 02 12:51:34 crc kubenswrapper[4658]: I1002 12:51:34.637692 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:34 crc kubenswrapper[4658]: I1002 12:51:34.699758 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:35 crc kubenswrapper[4658]: I1002 12:51:35.393451 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"] Oct 02 12:51:36 crc kubenswrapper[4658]: I1002 12:51:36.077286 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-llmvm" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="registry-server" containerID="cri-o://249d595c4fca330cca1c94883519d7676bcbf2466c129b58d73bda15aafdb0f7" gracePeriod=2 Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.089584 4658 generic.go:334] "Generic (PLEG): container finished" podID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerID="249d595c4fca330cca1c94883519d7676bcbf2466c129b58d73bda15aafdb0f7" exitCode=0 Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.089633 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerDied","Data":"249d595c4fca330cca1c94883519d7676bcbf2466c129b58d73bda15aafdb0f7"} Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.895672 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.971436 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities\") pod \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.971707 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj7b5\" (UniqueName: \"kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5\") pod \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.971796 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content\") pod \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\" (UID: \"d9bae60a-19e0-4eca-a610-9da8b7ee8db4\") " Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.972487 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities" (OuterVolumeSpecName: "utilities") pod "d9bae60a-19e0-4eca-a610-9da8b7ee8db4" (UID: "d9bae60a-19e0-4eca-a610-9da8b7ee8db4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.974704 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:37 crc kubenswrapper[4658]: I1002 12:51:37.981634 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5" (OuterVolumeSpecName: "kube-api-access-gj7b5") pod "d9bae60a-19e0-4eca-a610-9da8b7ee8db4" (UID: "d9bae60a-19e0-4eca-a610-9da8b7ee8db4"). InnerVolumeSpecName "kube-api-access-gj7b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.078113 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj7b5\" (UniqueName: \"kubernetes.io/projected/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-kube-api-access-gj7b5\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.081650 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9bae60a-19e0-4eca-a610-9da8b7ee8db4" (UID: "d9bae60a-19e0-4eca-a610-9da8b7ee8db4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.107677 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llmvm" event={"ID":"d9bae60a-19e0-4eca-a610-9da8b7ee8db4","Type":"ContainerDied","Data":"1e73365db8ebe7a2bd282d074a2b380bfc565e8855bfeb5276b9099366d81608"} Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.107741 4658 scope.go:117] "RemoveContainer" containerID="249d595c4fca330cca1c94883519d7676bcbf2466c129b58d73bda15aafdb0f7" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.107896 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llmvm" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.132160 4658 scope.go:117] "RemoveContainer" containerID="e24e0e66ae7e8a9a518c6755c1314a0399dde1995f67e8090f573ca394d0482b" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.155869 4658 scope.go:117] "RemoveContainer" containerID="d759308ba7e29bde5b9c46acb6bd5d25e7fac5b1ca902eba554db7f30488b826" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.180369 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9bae60a-19e0-4eca-a610-9da8b7ee8db4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.207669 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"] Oct 02 12:51:38 crc kubenswrapper[4658]: I1002 12:51:38.221714 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-llmvm"] Oct 02 12:51:39 crc kubenswrapper[4658]: I1002 12:51:39.961900 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" path="/var/lib/kubelet/pods/d9bae60a-19e0-4eca-a610-9da8b7ee8db4/volumes" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.809287 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:43 crc kubenswrapper[4658]: E1002 12:51:43.811066 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="extract-content" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.811079 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="extract-content" Oct 02 12:51:43 crc kubenswrapper[4658]: E1002 12:51:43.811099 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="extract-utilities" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.811105 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="extract-utilities" Oct 02 12:51:43 crc kubenswrapper[4658]: E1002 12:51:43.811145 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="registry-server" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.811151 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="registry-server" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.811567 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bae60a-19e0-4eca-a610-9da8b7ee8db4" containerName="registry-server" Oct 02 12:51:43 crc 
kubenswrapper[4658]: I1002 12:51:43.814258 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.872726 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.886074 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.886155 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.886242 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzt9\" (UniqueName: \"kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.988691 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzt9\" (UniqueName: \"kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.989172 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.989243 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.989906 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:43 crc kubenswrapper[4658]: I1002 12:51:43.991524 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 
02 12:51:44 crc kubenswrapper[4658]: I1002 12:51:44.009797 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzt9\" (UniqueName: \"kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9\") pod \"community-operators-54pfk\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:44 crc kubenswrapper[4658]: I1002 12:51:44.545627 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:45 crc kubenswrapper[4658]: I1002 12:51:45.060839 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:45 crc kubenswrapper[4658]: I1002 12:51:45.190061 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerStarted","Data":"d6c9b471c6057476c837b6828509b8e6a31c2d4d9baf97b41f9afde06bf40bf2"} Oct 02 12:51:45 crc kubenswrapper[4658]: E1002 12:51:45.480361 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06278da9_c0dc_4124_9d7b_ea23ad9375cb.slice/crio-951e80dc1885e76d0fc23c4154668e536d592bf788e35cfbba65adbd9b8b63b7.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:51:46 crc kubenswrapper[4658]: I1002 12:51:46.211576 4658 generic.go:334] "Generic (PLEG): container finished" podID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerID="951e80dc1885e76d0fc23c4154668e536d592bf788e35cfbba65adbd9b8b63b7" exitCode=0 Oct 02 12:51:46 crc kubenswrapper[4658]: I1002 12:51:46.211671 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerDied","Data":"951e80dc1885e76d0fc23c4154668e536d592bf788e35cfbba65adbd9b8b63b7"} Oct 02 12:51:47 crc kubenswrapper[4658]: I1002 12:51:47.221310 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerStarted","Data":"0aea70f271c4a30b416a0c12e8c0fcf9814aeb02fdaa5968883ee8a6cdda3cfd"} Oct 02 12:51:48 crc kubenswrapper[4658]: I1002 12:51:48.239325 4658 generic.go:334] "Generic (PLEG): container finished" podID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerID="0aea70f271c4a30b416a0c12e8c0fcf9814aeb02fdaa5968883ee8a6cdda3cfd" exitCode=0 Oct 02 12:51:48 crc kubenswrapper[4658]: I1002 12:51:48.239375 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerDied","Data":"0aea70f271c4a30b416a0c12e8c0fcf9814aeb02fdaa5968883ee8a6cdda3cfd"} Oct 02 12:51:49 crc kubenswrapper[4658]: I1002 12:51:49.248820 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerStarted","Data":"46861242d3da7f470306a478cb4e9df161cca6a772a91b1d062152776c7c7ccc"} Oct 02 12:51:49 crc kubenswrapper[4658]: I1002 12:51:49.267198 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54pfk" podStartSLOduration=3.696898986 
podStartE2EDuration="6.267176874s" podCreationTimestamp="2025-10-02 12:51:43 +0000 UTC" firstStartedPulling="2025-10-02 12:51:46.213484271 +0000 UTC m=+5587.104637838" lastFinishedPulling="2025-10-02 12:51:48.783762159 +0000 UTC m=+5589.674915726" observedRunningTime="2025-10-02 12:51:49.267102942 +0000 UTC m=+5590.158256539" watchObservedRunningTime="2025-10-02 12:51:49.267176874 +0000 UTC m=+5590.158330451" Oct 02 12:51:54 crc kubenswrapper[4658]: I1002 12:51:54.546754 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:54 crc kubenswrapper[4658]: I1002 12:51:54.547305 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:54 crc kubenswrapper[4658]: I1002 12:51:54.593393 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:55 crc kubenswrapper[4658]: I1002 12:51:55.364619 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:55 crc kubenswrapper[4658]: I1002 12:51:55.422776 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:57 crc kubenswrapper[4658]: I1002 12:51:57.339957 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54pfk" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="registry-server" containerID="cri-o://46861242d3da7f470306a478cb4e9df161cca6a772a91b1d062152776c7c7ccc" gracePeriod=2 Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.305384 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc95469d-r9kbr_456bb611-ccbc-4d1b-94bf-2ceb7d8345e3/barbican-api/0.log" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.368574 4658 generic.go:334] "Generic (PLEG): container finished" podID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerID="46861242d3da7f470306a478cb4e9df161cca6a772a91b1d062152776c7c7ccc" exitCode=0 Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.368631 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerDied","Data":"46861242d3da7f470306a478cb4e9df161cca6a772a91b1d062152776c7c7ccc"} Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.368659 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54pfk" event={"ID":"06278da9-c0dc-4124-9d7b-ea23ad9375cb","Type":"ContainerDied","Data":"d6c9b471c6057476c837b6828509b8e6a31c2d4d9baf97b41f9afde06bf40bf2"} Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.368671 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c9b471c6057476c837b6828509b8e6a31c2d4d9baf97b41f9afde06bf40bf2" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.402515 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc95469d-r9kbr_456bb611-ccbc-4d1b-94bf-2ceb7d8345e3/barbican-api-log/0.log" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.435576 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.518784 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzt9\" (UniqueName: \"kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9\") pod \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.519109 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content\") pod \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.519273 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities\") pod \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\" (UID: \"06278da9-c0dc-4124-9d7b-ea23ad9375cb\") " Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.520535 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities" (OuterVolumeSpecName: "utilities") pod "06278da9-c0dc-4124-9d7b-ea23ad9375cb" (UID: "06278da9-c0dc-4124-9d7b-ea23ad9375cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.527453 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9" (OuterVolumeSpecName: "kube-api-access-mdzt9") pod "06278da9-c0dc-4124-9d7b-ea23ad9375cb" (UID: "06278da9-c0dc-4124-9d7b-ea23ad9375cb"). InnerVolumeSpecName "kube-api-access-mdzt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.564777 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06278da9-c0dc-4124-9d7b-ea23ad9375cb" (UID: "06278da9-c0dc-4124-9d7b-ea23ad9375cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.605015 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-54ff5bbf66-pmxfv_ed9f1355-f34e-479c-8030-c2848860beb6/barbican-keystone-listener/0.log" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.621152 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.621192 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzt9\" (UniqueName: \"kubernetes.io/projected/06278da9-c0dc-4124-9d7b-ea23ad9375cb-kube-api-access-mdzt9\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.621205 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06278da9-c0dc-4124-9d7b-ea23ad9375cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.734022 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-54ff5bbf66-pmxfv_ed9f1355-f34e-479c-8030-c2848860beb6/barbican-keystone-listener-log/0.log" Oct 02 12:51:58 crc kubenswrapper[4658]: I1002 12:51:58.795049 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-698b689fd7-9wp8g_e5fc61f1-3fdf-430c-890e-4e220859285b/barbican-worker/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.094101 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-698b689fd7-9wp8g_e5fc61f1-3fdf-430c-890e-4e220859285b/barbican-worker-log/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.135126 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx_3e768ea4-04c3-4825-9431-a37f41f34a01/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.375921 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54pfk" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.414140 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.417094 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/ceilometer-notification-agent/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.422521 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54pfk"] Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.426952 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/ceilometer-central-agent/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.519097 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/proxy-httpd/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.596163 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/sg-core/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.828170 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca5cc232-0768-4541-b654-03a61ffd7ddc/cinder-api-log/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.917022 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca5cc232-0768-4541-b654-03a61ffd7ddc/cinder-api/0.log" Oct 02 12:51:59 crc kubenswrapper[4658]: I1002 12:51:59.960128 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" path="/var/lib/kubelet/pods/06278da9-c0dc-4124-9d7b-ea23ad9375cb/volumes" Oct 02 12:52:00 crc kubenswrapper[4658]: I1002 12:52:00.055020 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_efbe9a47-e907-4393-8f6a-9e1a824383f4/cinder-scheduler/0.log" Oct 02 12:52:00 crc kubenswrapper[4658]: I1002 12:52:00.130307 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_efbe9a47-e907-4393-8f6a-9e1a824383f4/probe/0.log" Oct 02 12:52:00 crc kubenswrapper[4658]: I1002 12:52:00.496393 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn_6eed4da6-fdf5-4db6-9e72-1d3052a54482/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:00 crc kubenswrapper[4658]: I1002 12:52:00.788193 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p_c5792ae0-4758-472c-94b6-b4f313cc3462/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:00 crc kubenswrapper[4658]: I1002 12:52:00.917822 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kzzml_09073a04-723b-4564-8f3a-efbc628cb7ef/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.191279 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/init/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.386881 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/init/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.539864 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6b65f_43792b79-e840-4c83-b2b9-8068765b000a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.596932 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/dnsmasq-dns/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.733726 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_67f8b15f-e190-40d6-8b7b-e8ba932f00f9/glance-httpd/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.798038 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_67f8b15f-e190-40d6-8b7b-e8ba932f00f9/glance-log/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.892779 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6306f11-af13-4078-ad43-b00e333855b1/glance-log/0.log" Oct 02 12:52:01 crc kubenswrapper[4658]: I1002 12:52:01.964792 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6306f11-af13-4078-ad43-b00e333855b1/glance-httpd/0.log" Oct 02 12:52:02 crc kubenswrapper[4658]: I1002 12:52:02.249640 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon/0.log" Oct 02 12:52:02 crc kubenswrapper[4658]: I1002 12:52:02.270464 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon/1.log" Oct 02 12:52:02 crc kubenswrapper[4658]: I1002 12:52:02.522947 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq_8d5900ee-9fca-4a00-8343-b51c6728627d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:02 crc kubenswrapper[4658]: I1002 12:52:02.637588 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xk4g7_59aa0d09-3a44-4e0a-b2d2-7f297a223854/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:02 crc kubenswrapper[4658]: I1002 12:52:02.852751 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon-log/0.log" Oct 02 12:52:03 crc kubenswrapper[4658]: I1002 12:52:03.114099 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323441-htvr7_a39e700e-3d2a-4deb-8ab5-ad53c0cf8276/keystone-cron/0.log" Oct 02 12:52:03 crc kubenswrapper[4658]: I1002 12:52:03.162864 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f67801c0-f438-43ae-a45b-c2870b64f553/kube-state-metrics/0.log" Oct 02 12:52:03 crc kubenswrapper[4658]: I1002 12:52:03.344992 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db8df9d95-jgkgn_e57e6b14-51e6-4efb-ba74-8e57b5e3aa72/keystone-api/0.log" Oct 02 12:52:03 crc kubenswrapper[4658]: I1002 12:52:03.498071 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-59fjq_074ed90b-9bda-4d7f-819d-41f3e7569ac4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:04 crc kubenswrapper[4658]: I1002 12:52:04.077510 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6989c4ffd5-z7vdb_299ba238-fcb8-4f4b-94ea-73ac08404680/neutron-httpd/0.log" Oct 02 12:52:04 crc kubenswrapper[4658]: I1002 12:52:04.126291 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6989c4ffd5-z7vdb_299ba238-fcb8-4f4b-94ea-73ac08404680/neutron-api/0.log" Oct 02 12:52:04 crc kubenswrapper[4658]: I1002 12:52:04.418785 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb_ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:05 crc kubenswrapper[4658]: I1002 12:52:05.162550 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8441c161-18f6-46d9-a327-ac3857d077d2/nova-cell0-conductor-conductor/0.log" Oct 02 12:52:05 crc kubenswrapper[4658]: I1002 12:52:05.798558 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d623c2ea-e4e8-4031-af93-35f76f08dba2/nova-cell1-conductor-conductor/0.log" Oct 02 12:52:06 crc kubenswrapper[4658]: I1002 12:52:06.011756 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2b26ff3c-8765-4911-aee2-54a863e4fd7c/nova-api-log/0.log" Oct 02 12:52:06 crc kubenswrapper[4658]: I1002 12:52:06.296243 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2b26ff3c-8765-4911-aee2-54a863e4fd7c/nova-api-api/0.log" Oct 02 12:52:06 crc kubenswrapper[4658]: I1002 12:52:06.340050 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7e69ac9b-be4b-4d88-bf64-06f4ca3966ba/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 12:52:06 crc kubenswrapper[4658]: I1002 12:52:06.600681 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qg2dq_4d537487-cd7a-43bd-ba29-fc9df6af7913/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:06 crc kubenswrapper[4658]: I1002 12:52:06.721679 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f818de7d-6833-4011-aded-a3de906237c4/nova-metadata-log/0.log" Oct 02 12:52:07 crc kubenswrapper[4658]: I1002 12:52:07.319233 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c066a72f-72df-47f5-b481-12ba73cb8d5f/nova-scheduler-scheduler/0.log" Oct 02 12:52:07 crc kubenswrapper[4658]: I1002 12:52:07.396108 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/mysql-bootstrap/0.log" Oct 02 12:52:07 crc kubenswrapper[4658]: I1002 12:52:07.567158 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/mysql-bootstrap/0.log" Oct 02 12:52:07 crc kubenswrapper[4658]: I1002 12:52:07.640236 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/galera/0.log" Oct 02 12:52:07 crc kubenswrapper[4658]: I1002 12:52:07.945946 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/mysql-bootstrap/0.log" Oct 02 12:52:08 crc kubenswrapper[4658]: I1002 12:52:08.228518 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/mysql-bootstrap/0.log" Oct 02 12:52:08 crc kubenswrapper[4658]: I1002 12:52:08.258517 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/galera/0.log" Oct 02 12:52:08 crc kubenswrapper[4658]: I1002 12:52:08.509939 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_53d4842f-7f97-4191-bcea-c8076517503f/openstackclient/0.log" Oct 02 12:52:08 crc kubenswrapper[4658]: I1002 12:52:08.821792 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-h2htr_ed2f1df6-db7a-483e-a80d-298f12a389c8/ovn-controller/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.105463 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rjq6k_313d8a11-a864-4fe8-b083-cc3f713cd4f7/openstack-network-exporter/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.242168 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f818de7d-6833-4011-aded-a3de906237c4/nova-metadata-metadata/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.348776 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server-init/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.594246 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server-init/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.641961 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovs-vswitchd/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.673042 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server/0.log" Oct 02 12:52:09 crc kubenswrapper[4658]: I1002 12:52:09.953180 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gkwgw_3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.362318 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57cd238e-33f1-4536-bcf1-1ca7e57a141a/openstack-network-exporter/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.424519 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57cd238e-33f1-4536-bcf1-1ca7e57a141a/ovn-northd/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.577758 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e17b8e1f-e0a9-4648-b16b-1f62fa63d507/openstack-network-exporter/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.692773 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e17b8e1f-e0a9-4648-b16b-1f62fa63d507/ovsdbserver-nb/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.887852 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_44a349ce-b770-4e0a-bc23-afb9bdea6eba/openstack-network-exporter/0.log" Oct 02 12:52:10 crc kubenswrapper[4658]: I1002 12:52:10.942407 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_44a349ce-b770-4e0a-bc23-afb9bdea6eba/ovsdbserver-sb/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.404725 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-574d544bd8-7g449_c77ff071-5d94-49df-a4b3-25c8dd727b6e/placement-api/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.494320 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/init-config-reloader/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.522283 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-574d544bd8-7g449_c77ff071-5d94-49df-a4b3-25c8dd727b6e/placement-log/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.756938 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/config-reloader/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.800467 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/init-config-reloader/0.log" Oct 02 12:52:11 crc kubenswrapper[4658]: I1002 12:52:11.816004 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/prometheus/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.057022 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/thanos-sidecar/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.064585 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/setup-container/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.288610 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/setup-container/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.302996 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/rabbitmq/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.537580 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/setup-container/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.699916 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/setup-container/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.764575 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/rabbitmq/0.log" Oct 02 12:52:12 crc kubenswrapper[4658]: I1002 12:52:12.939950 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8_f8637fd5-d51c-4da2-a043-98c8f655f10f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:13 crc kubenswrapper[4658]: I1002 12:52:13.151283 4658 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kwgw7_270f59c2-b21f-4b38-821c-5c1b4ce0be21/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:13 crc kubenswrapper[4658]: I1002 12:52:13.302366 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb_4dbacd18-944b-4b5f-be12-5ac2c1cb163a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:13 crc kubenswrapper[4658]: I1002 12:52:13.588932 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bztkh_26a7e52d-c3b7-4a7d-ae46-c2f32adb479a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:13 crc kubenswrapper[4658]: I1002 12:52:13.726080 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz25k_5a2e4e7a-11ed-4e29-b2f3-28919813fa63/ssh-known-hosts-edpm-deployment/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.091413 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5566488b4c-k88mg_67435e65-47df-41df-9570-df74c35bd5fc/proxy-server/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.096111 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5566488b4c-k88mg_67435e65-47df-41df-9570-df74c35bd5fc/proxy-httpd/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.318866 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fkzqr_0909c66f-f3c6-440c-add2-8784d1c209c7/swift-ring-rebalance/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.326475 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-auditor/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.561550 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-replicator/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.584017 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-reaper/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.617717 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-server/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.790704 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-auditor/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.859085 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-server/0.log" Oct 02 12:52:14 crc kubenswrapper[4658]: I1002 12:52:14.870200 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-replicator/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.028091 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-updater/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.073232 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-expirer/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.103751 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-auditor/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.262065 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-replicator/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.306202 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-server/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.330238 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-updater/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.511641 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/rsync/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.553253 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/swift-recon-cron/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.730903 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp_7d923299-fe7c-4ece-8f48-7c95a141f4c8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:15 crc kubenswrapper[4658]: I1002 12:52:15.870799 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fd9ceedd-f5a7-425a-9112-998edc1d3e00/tempest-tests-tempest-tests-runner/0.log" Oct 02 12:52:16 crc kubenswrapper[4658]: I1002 12:52:16.054534 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9fb066d3-ce67-4635-bebd-2e24da16a2a8/test-operator-logs-container/0.log" Oct 02 12:52:16 crc kubenswrapper[4658]: I1002 12:52:16.305502 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5_bfda0e17-a4e9-4a4f-9678-418901ed432a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:52:17 crc kubenswrapper[4658]: I1002 12:52:17.320040 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_dba2292e-4150-4a9d-9b22-49482e381c6c/watcher-applier/0.log" Oct 02 12:52:17 crc kubenswrapper[4658]: I1002 12:52:17.707056 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a963ca85-eeb4-4678-849f-b5b980b36091/watcher-api-log/0.log" Oct 02 12:52:18 crc kubenswrapper[4658]: I1002 12:52:18.194808 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3f3cc404-a92f-4ef8-a799-83eb314e4382/memcached/0.log" Oct 02 12:52:18 crc kubenswrapper[4658]: I1002 12:52:18.944259 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_34ba94d4-e1db-40a9-93e7-5a4e053ae8db/watcher-decision-engine/0.log" Oct 02 12:52:20 crc kubenswrapper[4658]: I1002 12:52:20.341662 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a963ca85-eeb4-4678-849f-b5b980b36091/watcher-api/0.log" Oct 02 12:53:27 crc kubenswrapper[4658]: 
I1002 12:53:27.429290 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:53:27 crc kubenswrapper[4658]: I1002 12:53:27.429884 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:53:44 crc kubenswrapper[4658]: I1002 12:53:44.498022 4658 generic.go:334] "Generic (PLEG): container finished" podID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" containerID="61244543cbc6c29edf40b5592f7a6395eb1b72a1a3132c344989254b28f9c772" exitCode=0 Oct 02 12:53:44 crc kubenswrapper[4658]: I1002 12:53:44.498164 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" event={"ID":"3932b79e-fb70-4697-a1e0-0008b7cf9ae3","Type":"ContainerDied","Data":"61244543cbc6c29edf40b5592f7a6395eb1b72a1a3132c344989254b28f9c772"} Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.642335 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.692345 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-6kcd5"] Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.703913 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-6kcd5"] Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.832317 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host\") pod \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.832442 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bphtm\" (UniqueName: \"kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm\") pod \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\" (UID: \"3932b79e-fb70-4697-a1e0-0008b7cf9ae3\") " Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.832444 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host" (OuterVolumeSpecName: "host") pod "3932b79e-fb70-4697-a1e0-0008b7cf9ae3" (UID: "3932b79e-fb70-4697-a1e0-0008b7cf9ae3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.832835 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.841788 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm" (OuterVolumeSpecName: "kube-api-access-bphtm") pod "3932b79e-fb70-4697-a1e0-0008b7cf9ae3" (UID: "3932b79e-fb70-4697-a1e0-0008b7cf9ae3"). 
InnerVolumeSpecName "kube-api-access-bphtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.935217 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bphtm\" (UniqueName: \"kubernetes.io/projected/3932b79e-fb70-4697-a1e0-0008b7cf9ae3-kube-api-access-bphtm\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:45 crc kubenswrapper[4658]: I1002 12:53:45.966705 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" path="/var/lib/kubelet/pods/3932b79e-fb70-4697-a1e0-0008b7cf9ae3/volumes" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.530005 4658 scope.go:117] "RemoveContainer" containerID="61244543cbc6c29edf40b5592f7a6395eb1b72a1a3132c344989254b28f9c772" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.530044 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-6kcd5" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.902492 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-k9sks"] Oct 02 12:53:46 crc kubenswrapper[4658]: E1002 12:53:46.902934 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" containerName="container-00" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.902948 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" containerName="container-00" Oct 02 12:53:46 crc kubenswrapper[4658]: E1002 12:53:46.902959 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="extract-content" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.902969 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="extract-content" Oct 02 12:53:46 crc kubenswrapper[4658]: E1002 12:53:46.902994 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="registry-server" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.903003 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="registry-server" Oct 02 12:53:46 crc kubenswrapper[4658]: E1002 12:53:46.903024 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="extract-utilities" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.903031 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="extract-utilities" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.903276 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="06278da9-c0dc-4124-9d7b-ea23ad9375cb" containerName="registry-server" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.903324 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="3932b79e-fb70-4697-a1e0-0008b7cf9ae3" containerName="container-00" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.904108 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.959666 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lrmt\" (UniqueName: \"kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:46 crc kubenswrapper[4658]: I1002 12:53:46.960041 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.061400 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.061570 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lrmt\" (UniqueName: \"kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.061642 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.092495 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lrmt\" (UniqueName: \"kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt\") pod \"crc-debug-k9sks\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.228981 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.543058 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" event={"ID":"6f46358b-e1e2-4dde-b808-9909ed0872aa","Type":"ContainerStarted","Data":"08abc25f9255ab4102f90a64dd5c1053ea21aa34f17ca33252f0d3ea5307b5d5"} Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.543320 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" event={"ID":"6f46358b-e1e2-4dde-b808-9909ed0872aa","Type":"ContainerStarted","Data":"d290498fa9d27eed71f8ed7fdd4bde196528ea335202c2650efe3c809edb5960"} Oct 02 12:53:47 crc kubenswrapper[4658]: I1002 12:53:47.557822 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" podStartSLOduration=1.557806112 podStartE2EDuration="1.557806112s" podCreationTimestamp="2025-10-02 12:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:53:47.557178952 +0000 UTC m=+5708.448332519" watchObservedRunningTime="2025-10-02 12:53:47.557806112 +0000 UTC m=+5708.448959679" Oct 02 12:53:48 crc kubenswrapper[4658]: I1002 12:53:48.552134 4658 generic.go:334] "Generic (PLEG): container finished" podID="6f46358b-e1e2-4dde-b808-9909ed0872aa" containerID="08abc25f9255ab4102f90a64dd5c1053ea21aa34f17ca33252f0d3ea5307b5d5" exitCode=0 Oct 02 12:53:48 crc kubenswrapper[4658]: I1002 12:53:48.552194 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" event={"ID":"6f46358b-e1e2-4dde-b808-9909ed0872aa","Type":"ContainerDied","Data":"08abc25f9255ab4102f90a64dd5c1053ea21aa34f17ca33252f0d3ea5307b5d5"} Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.673204 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.807778 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host\") pod \"6f46358b-e1e2-4dde-b808-9909ed0872aa\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.807856 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host" (OuterVolumeSpecName: "host") pod "6f46358b-e1e2-4dde-b808-9909ed0872aa" (UID: "6f46358b-e1e2-4dde-b808-9909ed0872aa"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.807891 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lrmt\" (UniqueName: \"kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt\") pod \"6f46358b-e1e2-4dde-b808-9909ed0872aa\" (UID: \"6f46358b-e1e2-4dde-b808-9909ed0872aa\") " Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.808472 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f46358b-e1e2-4dde-b808-9909ed0872aa-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.821921 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt" (OuterVolumeSpecName: "kube-api-access-7lrmt") pod "6f46358b-e1e2-4dde-b808-9909ed0872aa" (UID: "6f46358b-e1e2-4dde-b808-9909ed0872aa"). InnerVolumeSpecName "kube-api-access-7lrmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:49 crc kubenswrapper[4658]: I1002 12:53:49.909940 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lrmt\" (UniqueName: \"kubernetes.io/projected/6f46358b-e1e2-4dde-b808-9909ed0872aa-kube-api-access-7lrmt\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:50 crc kubenswrapper[4658]: I1002 12:53:50.570698 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" event={"ID":"6f46358b-e1e2-4dde-b808-9909ed0872aa","Type":"ContainerDied","Data":"d290498fa9d27eed71f8ed7fdd4bde196528ea335202c2650efe3c809edb5960"} Oct 02 12:53:50 crc kubenswrapper[4658]: I1002 12:53:50.571081 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d290498fa9d27eed71f8ed7fdd4bde196528ea335202c2650efe3c809edb5960" Oct 02 12:53:50 crc kubenswrapper[4658]: I1002 12:53:50.570856 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-k9sks" Oct 02 12:53:57 crc kubenswrapper[4658]: I1002 12:53:57.095408 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-k9sks"] Oct 02 12:53:57 crc kubenswrapper[4658]: I1002 12:53:57.103825 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-k9sks"] Oct 02 12:53:57 crc kubenswrapper[4658]: I1002 12:53:57.430107 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:53:57 crc kubenswrapper[4658]: I1002 12:53:57.430179 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:53:57 crc kubenswrapper[4658]: I1002 12:53:57.961647 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f46358b-e1e2-4dde-b808-9909ed0872aa" path="/var/lib/kubelet/pods/6f46358b-e1e2-4dde-b808-9909ed0872aa/volumes" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.305912 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-pxxpq"] Oct 02 12:53:58 crc kubenswrapper[4658]: E1002 12:53:58.306699 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f46358b-e1e2-4dde-b808-9909ed0872aa" containerName="container-00" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.306716 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46358b-e1e2-4dde-b808-9909ed0872aa" containerName="container-00" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.306998 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f46358b-e1e2-4dde-b808-9909ed0872aa" containerName="container-00" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.307773 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.456861 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dccl\" (UniqueName: \"kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.456962 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.559197 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dccl\" (UniqueName: \"kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.559328 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.559425 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.577444 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dccl\" (UniqueName: \"kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl\") pod \"crc-debug-pxxpq\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:58 crc kubenswrapper[4658]: I1002 12:53:58.638819 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:53:59 crc kubenswrapper[4658]: E1002 12:53:59.059106 4658 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcdc0fd8_e055_4316_899f_6b6075a0eb92.slice/crio-conmon-ff6f4e5dcec3b1ae9c0d5374f03cefd4e5f7cd564d6b0cee6a51de2123f9900c.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:53:59 crc kubenswrapper[4658]: I1002 12:53:59.651462 4658 generic.go:334] "Generic (PLEG): container finished" podID="dcdc0fd8-e055-4316-899f-6b6075a0eb92" containerID="ff6f4e5dcec3b1ae9c0d5374f03cefd4e5f7cd564d6b0cee6a51de2123f9900c" exitCode=0 Oct 02 12:53:59 crc kubenswrapper[4658]: I1002 12:53:59.651715 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" event={"ID":"dcdc0fd8-e055-4316-899f-6b6075a0eb92","Type":"ContainerDied","Data":"ff6f4e5dcec3b1ae9c0d5374f03cefd4e5f7cd564d6b0cee6a51de2123f9900c"} Oct 02 12:53:59 crc kubenswrapper[4658]: I1002 12:53:59.652031 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" event={"ID":"dcdc0fd8-e055-4316-899f-6b6075a0eb92","Type":"ContainerStarted","Data":"c47a6930456c0f580e6b45c65b54fff2dfd2d21a11737bb70b16b48521f9af4d"} Oct 02 12:53:59 crc kubenswrapper[4658]: I1002 12:53:59.698857 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-pxxpq"] Oct 02 12:53:59 crc kubenswrapper[4658]: I1002 12:53:59.708102 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mtlb6/crc-debug-pxxpq"] Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.766820 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.924424 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dccl\" (UniqueName: \"kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl\") pod \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.924478 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host\") pod \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\" (UID: \"dcdc0fd8-e055-4316-899f-6b6075a0eb92\") " Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.924934 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host" (OuterVolumeSpecName: "host") pod "dcdc0fd8-e055-4316-899f-6b6075a0eb92" (UID: "dcdc0fd8-e055-4316-899f-6b6075a0eb92"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.925402 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdc0fd8-e055-4316-899f-6b6075a0eb92-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:00 crc kubenswrapper[4658]: I1002 12:54:00.943599 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl" (OuterVolumeSpecName: "kube-api-access-8dccl") pod "dcdc0fd8-e055-4316-899f-6b6075a0eb92" (UID: "dcdc0fd8-e055-4316-899f-6b6075a0eb92"). InnerVolumeSpecName "kube-api-access-8dccl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.027532 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dccl\" (UniqueName: \"kubernetes.io/projected/dcdc0fd8-e055-4316-899f-6b6075a0eb92-kube-api-access-8dccl\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.321182 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.434738 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.497156 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.542656 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.694478 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.694805 4658 scope.go:117] "RemoveContainer" containerID="ff6f4e5dcec3b1ae9c0d5374f03cefd4e5f7cd564d6b0cee6a51de2123f9900c" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.695142 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mtlb6/crc-debug-pxxpq" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.723337 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/extract/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.729308 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.905948 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-kkldn_3f426838-95ca-4579-9745-e78f0ccab683/kube-rbac-proxy/0.log" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.966380 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcdc0fd8-e055-4316-899f-6b6075a0eb92" path="/var/lib/kubelet/pods/dcdc0fd8-e055-4316-899f-6b6075a0eb92/volumes" Oct 02 12:54:01 crc kubenswrapper[4658]: I1002 12:54:01.971816 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-kkldn_3f426838-95ca-4579-9745-e78f0ccab683/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.007490 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gckv9_7744dcc1-5c52-4447-8123-53e4c98250fd/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.096996 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gckv9_7744dcc1-5c52-4447-8123-53e4c98250fd/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.146958 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fgm4w_af944184-d59a-467d-983e-c66fb79823c6/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.205899 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fgm4w_af944184-d59a-467d-983e-c66fb79823c6/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.321332 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8ttj2_70026a4a-6db4-4777-afed-a5ea3de1fc60/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.420980 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8ttj2_70026a4a-6db4-4777-afed-a5ea3de1fc60/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.493492 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-66q5b_6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.566806 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-66q5b_6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.656418 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-ljz2h_d480d1a6-c309-454f-8e99-a762feed8490/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.723890 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-ljz2h_d480d1a6-c309-454f-8e99-a762feed8490/manager/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.868254 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-kznvq_f527a8e5-d051-4017-80e4-e3b2f1fd59ba/kube-rbac-proxy/0.log" Oct 02 12:54:02 crc kubenswrapper[4658]: I1002 12:54:02.945564 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7mfsk_bf9ac0a3-4903-4115-9793-b6bd913d4e0a/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.053519 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-kznvq_f527a8e5-d051-4017-80e4-e3b2f1fd59ba/manager/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.102348 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7mfsk_bf9ac0a3-4903-4115-9793-b6bd913d4e0a/manager/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.153173 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-l62bl_d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.287093 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-l62bl_d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b/manager/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.330813 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-tnfxq_55b04e2c-c701-4f74-9fb6-1dce9d2de108/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.387087 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-tnfxq_55b04e2c-c701-4f74-9fb6-1dce9d2de108/manager/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.585827 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-g8dwz_9787421c-8d35-4d30-8946-90bc71eba9c0/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.611005 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-g8dwz_9787421c-8d35-4d30-8946-90bc71eba9c0/manager/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.697898 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-fsnf7_6a460926-8982-40c1-b177-3620aa3dcb79/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.819906 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-htz9g_830f6e33-ad1f-4033-a725-9f10415996e7/kube-rbac-proxy/0.log" Oct 02 12:54:03 crc kubenswrapper[4658]: I1002 12:54:03.831623 4658 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-fsnf7_6a460926-8982-40c1-b177-3620aa3dcb79/manager/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.037612 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-htz9g_830f6e33-ad1f-4033-a725-9f10415996e7/manager/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.049560 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-kbj6t_5aeb03f1-db88-497b-b3cb-11e01e2a7b31/kube-rbac-proxy/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.059517 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-kbj6t_5aeb03f1-db88-497b-b3cb-11e01e2a7b31/manager/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.229431 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ffhdh_afbaa143-b11e-406d-b797-6ba114fbf9a4/kube-rbac-proxy/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.238421 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ffhdh_afbaa143-b11e-406d-b797-6ba114fbf9a4/manager/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.368698 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f6b64f7bf-8c66j_903dfdb7-34f3-4875-8009-482cb7d5469b/kube-rbac-proxy/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.514555 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f47f5dc76-j82tf_943e808f-860b-4f8a-a933-84f0dd0cddc5/kube-rbac-proxy/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.726949 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4jglm_71959757-609a-415a-9717-711c3f8ad66d/registry-server/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.757088 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f47f5dc76-j82tf_943e808f-860b-4f8a-a933-84f0dd0cddc5/operator/0.log" Oct 02 12:54:04 crc kubenswrapper[4658]: I1002 12:54:04.888107 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-mhrcv_7b2e2130-4b00-4242-8254-c8be160bfe89/kube-rbac-proxy/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.010098 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-wqqdv_c802dbff-c65f-40e9-91ee-3ea6f0aee6a2/kube-rbac-proxy/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.053514 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-mhrcv_7b2e2130-4b00-4242-8254-c8be160bfe89/manager/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.180588 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-wqqdv_c802dbff-c65f-40e9-91ee-3ea6f0aee6a2/manager/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.279919 
4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-xw82t_75df76ba-0998-4b89-887e-d8f0b1c546b4/operator/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.575135 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-ppg68_c92dcd56-734e-430c-813e-1405ab2e141b/kube-rbac-proxy/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.613041 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-ppg68_c92dcd56-734e-430c-813e-1405ab2e141b/manager/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.615797 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f6b64f7bf-8c66j_903dfdb7-34f3-4875-8009-482cb7d5469b/manager/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.680087 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-49k5r_33b8c756-1330-4114-bf78-2b3835667a1e/kube-rbac-proxy/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.800000 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-4bhqs_3dba06c0-4986-438c-a553-76b0bcddd74c/kube-rbac-proxy/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.861409 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-4bhqs_3dba06c0-4986-438c-a553-76b0bcddd74c/manager/0.log" Oct 02 12:54:05 crc kubenswrapper[4658]: I1002 12:54:05.981371 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7fc7d86889-mqpv9_e9eb741d-265d-4f59-ab6e-c6a42f720801/kube-rbac-proxy/0.log" Oct 02 12:54:06 crc kubenswrapper[4658]: I1002 12:54:06.069597 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-49k5r_33b8c756-1330-4114-bf78-2b3835667a1e/manager/0.log" Oct 02 12:54:06 crc kubenswrapper[4658]: I1002 12:54:06.088256 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7fc7d86889-mqpv9_e9eb741d-265d-4f59-ab6e-c6a42f720801/manager/0.log" Oct 02 12:54:20 crc kubenswrapper[4658]: I1002 12:54:20.978960 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4wktl_2b161a36-8654-4948-8412-bb68940fe512/control-plane-machine-set-operator/0.log" Oct 02 12:54:21 crc kubenswrapper[4658]: I1002 12:54:21.213567 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gjt96_bfa1953c-4c82-4463-b772-6b871bcea9b8/kube-rbac-proxy/0.log" Oct 02 12:54:21 crc kubenswrapper[4658]: I1002 12:54:21.237035 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gjt96_bfa1953c-4c82-4463-b772-6b871bcea9b8/machine-api-operator/0.log" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.429764 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.430847 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.430928 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.432200 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.432283 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" gracePeriod=600 Oct 02 12:54:27 crc kubenswrapper[4658]: E1002 12:54:27.575904 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.963804 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" exitCode=0 Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.971385 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"} Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.971443 4658 scope.go:117] "RemoveContainer" containerID="8451f58522e374a93bce41d5d41b5de7d84687f96c2fae95ace55db12fae10c6" Oct 02 12:54:27 crc kubenswrapper[4658]: I1002 12:54:27.972881 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:54:27 crc kubenswrapper[4658]: E1002 12:54:27.973465 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:54:33 crc kubenswrapper[4658]: I1002 12:54:33.767984 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cn57q_2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf/cert-manager-controller/0.log" Oct 02 12:54:34 crc kubenswrapper[4658]: I1002 12:54:34.032491 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4jlqn_329487df-e7b0-4925-8c85-155c96453929/cert-manager-cainjector/0.log" Oct 02 12:54:34 crc kubenswrapper[4658]: I1002 12:54:34.110446 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-88pc4_648c22f9-bc82-4a6a-9b68-b9b557f0c243/cert-manager-webhook/0.log" Oct 02 12:54:38 crc kubenswrapper[4658]: I1002 12:54:38.949402 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:54:38 crc kubenswrapper[4658]: E1002 12:54:38.950195 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.067692 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-nczcp_53ded798-0460-49d4-8c75-f21907458150/nmstate-console-plugin/0.log" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.391403 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-g8smq_485529a7-2da9-40c3-adff-56109c78dbc1/kube-rbac-proxy/0.log" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.448448 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rpw4d_3f8f0836-7d23-4df5-8658-79d424122ab3/nmstate-handler/0.log" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.465242 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-g8smq_485529a7-2da9-40c3-adff-56109c78dbc1/nmstate-metrics/0.log" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.631676 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-zvskn_08cea959-43c4-4ecc-b38d-2960b5d8180c/nmstate-operator/0.log" Oct 02 12:54:46 crc kubenswrapper[4658]: I1002 12:54:46.664531 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-hbfjg_bf546db5-7a99-4338-9c1e-0aecfdf1d7fb/nmstate-webhook/0.log" Oct 02 12:54:53 crc kubenswrapper[4658]: I1002 12:54:53.950467 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:54:53 crc kubenswrapper[4658]: E1002 12:54:53.951286 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.256535 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-bsjg9_9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41/kube-rbac-proxy/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.497852 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bsjg9_9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41/controller/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.499010 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.701631 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.725212 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.731114 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.794505 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.936625 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.959809 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.968510 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 12:55:00 crc kubenswrapper[4658]: I1002 12:55:00.968623 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.151092 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.158215 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/controller/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.187196 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.190180 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.366708 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/kube-rbac-proxy/0.log" Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.386079 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/frr-metrics/0.log" Oct 02 12:55:01 crc 
kubenswrapper[4658]: I1002 12:55:01.400994 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/kube-rbac-proxy-frr/0.log"
Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.588767 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-k4bcd_74c490f6-26be-4b3c-93f7-65b1625425a1/frr-k8s-webhook-server/0.log"
Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.592428 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/reloader/0.log"
Oct 02 12:55:01 crc kubenswrapper[4658]: I1002 12:55:01.834472 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c6495c478-cxldq_f9c60d31-755b-4e0e-888c-072203581d0d/manager/0.log"
Oct 02 12:55:02 crc kubenswrapper[4658]: I1002 12:55:02.067468 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7db6cbc8bb-b4n8z_00a0ddd2-7f0b-4158-a95a-dd16a826ea1e/webhook-server/0.log"
Oct 02 12:55:02 crc kubenswrapper[4658]: I1002 12:55:02.132961 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mrv9d_f053e253-c411-41a9-b81b-d7cf91cc9b8b/kube-rbac-proxy/0.log"
Oct 02 12:55:02 crc kubenswrapper[4658]: I1002 12:55:02.846766 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mrv9d_f053e253-c411-41a9-b81b-d7cf91cc9b8b/speaker/0.log"
Oct 02 12:55:02 crc kubenswrapper[4658]: I1002 12:55:02.922107 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/frr/0.log"
Oct 02 12:55:06 crc kubenswrapper[4658]: I1002 12:55:06.949945 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:55:06 crc kubenswrapper[4658]: E1002 12:55:06.950693 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:55:13 crc kubenswrapper[4658]: I1002 12:55:13.946090 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.080744 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.104682 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.124650 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.273287 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.320246 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/extract/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.341671 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.445542 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.624278 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.652755 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.654908 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.809255 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.810739 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.837755 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/extract/0.log"
Oct 02 12:55:14 crc kubenswrapper[4658]: I1002 12:55:14.962358 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.115202 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.115226 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.117514 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.287179 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.361242 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.556122 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/registry-server/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.568530 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.668552 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.670523 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.748311 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.906485 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log"
Oct 02 12:55:15 crc kubenswrapper[4658]: I1002 12:55:15.918929 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.124400 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.389799 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.391960 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.433002 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.647406 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/extract/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.651501 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.671515 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.775472 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/registry-server/0.log"
Oct 02 12:55:16 crc kubenswrapper[4658]: I1002 12:55:16.896270 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-clrx4_0cdd5f96-dd0d-4f77-8e41-83a8493dbca7/marketplace-operator/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.012564 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.180108 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.180178 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.200563 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.319607 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.373822 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.374440 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.524734 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/registry-server/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.633058 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.644921 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.670859 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.792212 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log"
Oct 02 12:55:17 crc kubenswrapper[4658]: I1002 12:55:17.792618 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log"
Oct 02 12:55:18 crc kubenswrapper[4658]: I1002 12:55:18.470328 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/registry-server/0.log"
Oct 02 12:55:20 crc kubenswrapper[4658]: I1002 12:55:20.949531 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:55:20 crc kubenswrapper[4658]: E1002 12:55:20.950184 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:55:29 crc kubenswrapper[4658]: I1002 12:55:29.336196 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-wxwrk_6abb4e77-380e-45f9-94dd-0511e0194885/prometheus-operator/0.log"
Oct 02 12:55:29 crc kubenswrapper[4658]: I1002 12:55:29.428314 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm_e8c24809-b49a-4a7d-9fd8-58f83c33a290/prometheus-operator-admission-webhook/0.log"
Oct 02 12:55:29 crc kubenswrapper[4658]: I1002 12:55:29.463874 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g_8c427e03-4bb9-4dc4-a866-765e097e498f/prometheus-operator-admission-webhook/0.log"
Oct 02 12:55:29 crc kubenswrapper[4658]: I1002 12:55:29.647331 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-b7497_6ae51e31-b742-4b5c-870a-d7bfc95151f1/operator/0.log"
Oct 02 12:55:29 crc kubenswrapper[4658]: I1002 12:55:29.673516 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-4qk6g_79a71fa2-31f7-4ce5-9043-cdfad20543ec/perses-operator/0.log"
Oct 02 12:55:34 crc kubenswrapper[4658]: I1002 12:55:34.949138 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:55:34 crc kubenswrapper[4658]: E1002 12:55:34.949981 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:55:46 crc kubenswrapper[4658]: I1002 12:55:46.949606 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:55:46 crc kubenswrapper[4658]: E1002 12:55:46.950446 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:55:59 crc kubenswrapper[4658]: I1002 12:55:59.956641 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:55:59 crc kubenswrapper[4658]: E1002 12:55:59.957567 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:56:11 crc kubenswrapper[4658]: I1002 12:56:11.952042 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:56:11 crc kubenswrapper[4658]: E1002 12:56:11.952853 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:56:22 crc kubenswrapper[4658]: I1002 12:56:22.948997 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:56:22 crc kubenswrapper[4658]: E1002 12:56:22.949489 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:56:36 crc kubenswrapper[4658]: I1002 12:56:36.949245 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:56:36 crc kubenswrapper[4658]: E1002 12:56:36.950220 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:56:50 crc kubenswrapper[4658]: I1002 12:56:50.949696 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:56:50 crc kubenswrapper[4658]: E1002 12:56:50.951175 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:57:01 crc kubenswrapper[4658]: I1002 12:57:01.950097 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:57:01 crc kubenswrapper[4658]: E1002 12:57:01.951079 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:57:14 crc kubenswrapper[4658]: I1002 12:57:14.950082 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:57:14 crc kubenswrapper[4658]: E1002 12:57:14.951059 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:57:28 crc kubenswrapper[4658]: I1002 12:57:28.948765 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:57:28 crc kubenswrapper[4658]: E1002 12:57:28.949583 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:57:39 crc kubenswrapper[4658]: I1002 12:57:39.954723 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:57:39 crc kubenswrapper[4658]: E1002 12:57:39.955789 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:57:41 crc kubenswrapper[4658]: I1002 12:57:41.979976 4658 generic.go:334] "Generic (PLEG): container finished" podID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerID="75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c" exitCode=0
Oct 02 12:57:41 crc kubenswrapper[4658]: I1002 12:57:41.980011 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" event={"ID":"64073b89-1a4e-4ef4-b876-f24d3148632c","Type":"ContainerDied","Data":"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"}
Oct 02 12:57:41 crc kubenswrapper[4658]: I1002 12:57:41.981059 4658 scope.go:117] "RemoveContainer" containerID="75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"
Oct 02 12:57:42 crc kubenswrapper[4658]: I1002 12:57:42.834905 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mtlb6_must-gather-fkzcz_64073b89-1a4e-4ef4-b876-f24d3148632c/gather/0.log"
Oct 02 12:57:52 crc kubenswrapper[4658]: I1002 12:57:52.450252 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mtlb6/must-gather-fkzcz"]
Oct 02 12:57:52 crc kubenswrapper[4658]: I1002 12:57:52.451082 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mtlb6/must-gather-fkzcz" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="copy" containerID="cri-o://b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0" gracePeriod=2
Oct 02 12:57:52 crc kubenswrapper[4658]: I1002 12:57:52.471817 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mtlb6/must-gather-fkzcz"]
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.054916 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mtlb6_must-gather-fkzcz_64073b89-1a4e-4ef4-b876-f24d3148632c/copy/0.log"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.055473 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/must-gather-fkzcz"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.128355 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mtlb6_must-gather-fkzcz_64073b89-1a4e-4ef4-b876-f24d3148632c/copy/0.log"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.129022 4658 generic.go:334] "Generic (PLEG): container finished" podID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerID="b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0" exitCode=143
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.129090 4658 scope.go:117] "RemoveContainer" containerID="b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.129106 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mtlb6/must-gather-fkzcz"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.149662 4658 scope.go:117] "RemoveContainer" containerID="75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.200061 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output\") pod \"64073b89-1a4e-4ef4-b876-f24d3148632c\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") "
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.200191 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx84m\" (UniqueName: \"kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m\") pod \"64073b89-1a4e-4ef4-b876-f24d3148632c\" (UID: \"64073b89-1a4e-4ef4-b876-f24d3148632c\") "
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.212496 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m" (OuterVolumeSpecName: "kube-api-access-jx84m") pod "64073b89-1a4e-4ef4-b876-f24d3148632c" (UID: "64073b89-1a4e-4ef4-b876-f24d3148632c"). InnerVolumeSpecName "kube-api-access-jx84m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.246961 4658 scope.go:117] "RemoveContainer" containerID="b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0"
Oct 02 12:57:53 crc kubenswrapper[4658]: E1002 12:57:53.247587 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0\": container with ID starting with b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0 not found: ID does not exist" containerID="b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.247628 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0"} err="failed to get container status \"b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0\": rpc error: code = NotFound desc = could not find container \"b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0\": container with ID starting with b93173c351ef4125155f09677170a84fbe2a4b791eaba7eb7870baa0de2278d0 not found: ID does not exist"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.247653 4658 scope.go:117] "RemoveContainer" containerID="75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"
Oct 02 12:57:53 crc kubenswrapper[4658]: E1002 12:57:53.247944 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c\": container with ID starting with 75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c not found: ID does not exist" containerID="75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.247970 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c"} err="failed to get container status \"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c\": rpc error: code = NotFound desc = could not find container \"75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c\": container with ID starting with 75a59058bf6f607ca8d02f46c691891ee2c79bf4b95eb0b130de093fd77b130c not found: ID does not exist"
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.302742 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx84m\" (UniqueName: \"kubernetes.io/projected/64073b89-1a4e-4ef4-b876-f24d3148632c-kube-api-access-jx84m\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.392569 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "64073b89-1a4e-4ef4-b876-f24d3148632c" (UID: "64073b89-1a4e-4ef4-b876-f24d3148632c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.405130 4658 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64073b89-1a4e-4ef4-b876-f24d3148632c-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:53 crc kubenswrapper[4658]: I1002 12:57:53.958815 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" path="/var/lib/kubelet/pods/64073b89-1a4e-4ef4-b876-f24d3148632c/volumes"
Oct 02 12:57:54 crc kubenswrapper[4658]: I1002 12:57:54.950446 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:57:54 crc kubenswrapper[4658]: E1002 12:57:54.950948 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:58:08 crc kubenswrapper[4658]: I1002 12:58:08.949639 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:58:08 crc kubenswrapper[4658]: E1002 12:58:08.950587 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.434114 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:10 crc kubenswrapper[4658]: E1002 12:58:10.435552 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="gather"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.435637 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="gather"
Oct 02 12:58:10 crc kubenswrapper[4658]: E1002 12:58:10.435702 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="copy"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.435760 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="copy"
Oct 02 12:58:10 crc kubenswrapper[4658]: E1002 12:58:10.435838 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdc0fd8-e055-4316-899f-6b6075a0eb92" containerName="container-00"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.435895 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdc0fd8-e055-4316-899f-6b6075a0eb92" containerName="container-00"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.436371 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="copy"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.436466 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="64073b89-1a4e-4ef4-b876-f24d3148632c" containerName="gather"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.436550 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdc0fd8-e055-4316-899f-6b6075a0eb92" containerName="container-00"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.438123 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.476768 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.515063 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.515141 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjcvq\" (UniqueName: \"kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.515219 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.620449 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.620500 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjcvq\" (UniqueName: \"kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.620535 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.621097 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.624524 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.646397 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjcvq\" (UniqueName: \"kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq\") pod \"certified-operators-pssfg\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") " pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:10 crc kubenswrapper[4658]: I1002 12:58:10.766254 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:11 crc kubenswrapper[4658]: I1002 12:58:11.283965 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:11 crc kubenswrapper[4658]: I1002 12:58:11.307544 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerStarted","Data":"2ace4d420719007eb5635c01b2c095c4cb24eb99dbb9778d44c0e5c7d88708ab"}
Oct 02 12:58:12 crc kubenswrapper[4658]: I1002 12:58:12.317094 4658 generic.go:334] "Generic (PLEG): container finished" podID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerID="54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff" exitCode=0
Oct 02 12:58:12 crc kubenswrapper[4658]: I1002 12:58:12.317150 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerDied","Data":"54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff"}
Oct 02 12:58:12 crc kubenswrapper[4658]: I1002 12:58:12.318837 4658 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 12:58:14 crc kubenswrapper[4658]: I1002 12:58:14.338700 4658 generic.go:334] "Generic (PLEG): container finished" podID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerID="a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275" exitCode=0
Oct 02 12:58:14 crc kubenswrapper[4658]: I1002 12:58:14.338805 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerDied","Data":"a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275"}
Oct 02 12:58:16 crc kubenswrapper[4658]: I1002 12:58:16.360679 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerStarted","Data":"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"}
Oct 02 12:58:16 crc kubenswrapper[4658]: I1002 12:58:16.382095 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pssfg" podStartSLOduration=3.472767943 podStartE2EDuration="6.382071388s" podCreationTimestamp="2025-10-02 12:58:10 +0000 UTC" firstStartedPulling="2025-10-02 12:58:12.318635081 +0000 UTC m=+5973.209788648" lastFinishedPulling="2025-10-02 12:58:15.227938506 +0000 UTC m=+5976.119092093" observedRunningTime="2025-10-02 12:58:16.379142394 +0000 UTC m=+5977.270295961" watchObservedRunningTime="2025-10-02 12:58:16.382071388 +0000 UTC m=+5977.273224955"
Oct 02 12:58:19 crc kubenswrapper[4658]: I1002 12:58:19.959835 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:58:19 crc kubenswrapper[4658]: E1002 12:58:19.960395 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:58:20 crc kubenswrapper[4658]: I1002 12:58:20.767131 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:20 crc kubenswrapper[4658]: I1002 12:58:20.767455 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:20 crc kubenswrapper[4658]: I1002 12:58:20.841566 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:21 crc kubenswrapper[4658]: I1002 12:58:21.452162 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:21 crc kubenswrapper[4658]: I1002 12:58:21.498179 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:23 crc kubenswrapper[4658]: I1002 12:58:23.431930 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pssfg" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="registry-server" containerID="cri-o://d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a" gracePeriod=2
Oct 02 12:58:23 crc kubenswrapper[4658]: I1002 12:58:23.965128 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.086368 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities\") pod \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") "
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.086409 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjcvq\" (UniqueName: \"kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq\") pod \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") "
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.086604 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content\") pod \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\" (UID: \"c6613bdf-a008-45a0-91f0-335b3f0ff7e4\") "
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.087402 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities" (OuterVolumeSpecName: "utilities") pod "c6613bdf-a008-45a0-91f0-335b3f0ff7e4" (UID: "c6613bdf-a008-45a0-91f0-335b3f0ff7e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.093734 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq" (OuterVolumeSpecName: "kube-api-access-xjcvq") pod "c6613bdf-a008-45a0-91f0-335b3f0ff7e4" (UID: "c6613bdf-a008-45a0-91f0-335b3f0ff7e4"). InnerVolumeSpecName "kube-api-access-xjcvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.188963 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.189000 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjcvq\" (UniqueName: \"kubernetes.io/projected/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-kube-api-access-xjcvq\") on node \"crc\" DevicePath \"\""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.361178 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6613bdf-a008-45a0-91f0-335b3f0ff7e4" (UID: "c6613bdf-a008-45a0-91f0-335b3f0ff7e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.392444 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6613bdf-a008-45a0-91f0-335b3f0ff7e4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.444013 4658 generic.go:334] "Generic (PLEG): container finished" podID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerID="d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a" exitCode=0
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.444055 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerDied","Data":"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"}
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.444080 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssfg"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.444089 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssfg" event={"ID":"c6613bdf-a008-45a0-91f0-335b3f0ff7e4","Type":"ContainerDied","Data":"2ace4d420719007eb5635c01b2c095c4cb24eb99dbb9778d44c0e5c7d88708ab"}
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.444111 4658 scope.go:117] "RemoveContainer" containerID="d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.472983 4658 scope.go:117] "RemoveContainer" containerID="a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.476419 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.488855 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pssfg"]
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.509882 4658 scope.go:117] "RemoveContainer" containerID="54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.538777 4658 scope.go:117] "RemoveContainer" containerID="d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"
Oct 02 12:58:24 crc kubenswrapper[4658]: E1002 12:58:24.539187 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a\": container with ID starting with d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a not found: ID does not exist" containerID="d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.539228 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a"} err="failed to get container status \"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a\": rpc error: code = NotFound desc = could not find container \"d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a\": container with ID starting with d77546650f8a30916bea7c4c0c7e96cf5c35aeec003abcd67653280fa18e158a not found: ID does not exist"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.539254 4658 scope.go:117] "RemoveContainer" containerID="a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275"
Oct 02 12:58:24 crc kubenswrapper[4658]: E1002 12:58:24.539977 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275\": container with ID starting with a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275 not found: ID does not exist" containerID="a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.540016 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275"} err="failed to get container status \"a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275\": rpc error: code = NotFound desc = could not find container \"a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275\": container with ID starting with a12e1dc7ed735dc4ab1a0ba37d2c54c982110ce21507296cac0d5e4dd0fab275 not found: ID does not exist"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.540036 4658 scope.go:117] "RemoveContainer" containerID="54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff"
Oct 02 12:58:24 crc kubenswrapper[4658]: E1002 12:58:24.540492 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff\": container with ID starting with 54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff not found: ID does not exist" containerID="54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff"
Oct 02 12:58:24 crc kubenswrapper[4658]: I1002 12:58:24.540591 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff"} err="failed to get container status \"54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff\": rpc error: code = NotFound desc = could not find container \"54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff\": container with ID starting with 54f9747e63b5feabe977e7295e0d8c2f231f82a4ec18e88977db025fb83b4eff not found: ID does not exist"
Oct 02 12:58:25 crc kubenswrapper[4658]: I1002 12:58:25.727481 4658 scope.go:117] "RemoveContainer" containerID="951e80dc1885e76d0fc23c4154668e536d592bf788e35cfbba65adbd9b8b63b7"
Oct 02 12:58:25 crc kubenswrapper[4658]: I1002 12:58:25.759239 4658 scope.go:117] "RemoveContainer" containerID="46861242d3da7f470306a478cb4e9df161cca6a772a91b1d062152776c7c7ccc"
Oct 02 12:58:25 crc kubenswrapper[4658]: I1002 12:58:25.836096 4658 scope.go:117] "RemoveContainer" containerID="0aea70f271c4a30b416a0c12e8c0fcf9814aeb02fdaa5968883ee8a6cdda3cfd"
Oct 02 12:58:25 crc kubenswrapper[4658]: I1002 12:58:25.962791 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" path="/var/lib/kubelet/pods/c6613bdf-a008-45a0-91f0-335b3f0ff7e4/volumes"
Oct 02 12:58:32 crc kubenswrapper[4658]: I1002 12:58:32.951046 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600"
Oct 02 12:58:32 crc kubenswrapper[4658]: E1002 12:58:32.956208 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.692290 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5fsxr/must-gather-n4cj7"]
Oct 02 12:58:33 crc kubenswrapper[4658]: E1002 12:58:33.692872 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="extract-utilities"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.692897 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="extract-utilities"
Oct 02 12:58:33 crc kubenswrapper[4658]: E1002 12:58:33.692919 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="extract-content"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.692928 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="extract-content"
Oct 02 12:58:33 crc kubenswrapper[4658]: E1002 12:58:33.692944 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="registry-server"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.692951 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="registry-server"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.693177 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6613bdf-a008-45a0-91f0-335b3f0ff7e4" containerName="registry-server"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.694574 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.701132 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5fsxr"/"openshift-service-ca.crt"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.712192 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5fsxr/must-gather-n4cj7"]
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.714730 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5fsxr"/"kube-root-ca.crt"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.790087 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcnb\" (UniqueName: \"kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.790178 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.891637 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcnb\" (UniqueName: \"kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.891697 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.892274 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:33 crc kubenswrapper[4658]: I1002 12:58:33.915285 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcnb\" (UniqueName: \"kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb\") pod \"must-gather-n4cj7\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:34 crc kubenswrapper[4658]: I1002 12:58:34.016768 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/must-gather-n4cj7"
Oct 02 12:58:34 crc kubenswrapper[4658]: I1002 12:58:34.479019 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5fsxr/must-gather-n4cj7"]
Oct 02 12:58:34 crc kubenswrapper[4658]: I1002 12:58:34.551806 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" event={"ID":"75a3fbaf-6798-43b3-a912-fb1afa675811","Type":"ContainerStarted","Data":"b72396a01c933f87b5cf8cac1cf92b2f701646a91185104aaeb2399332c00b74"}
Oct 02 12:58:35 crc kubenswrapper[4658]: I1002 12:58:35.565247 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" event={"ID":"75a3fbaf-6798-43b3-a912-fb1afa675811","Type":"ContainerStarted","Data":"66f606f3febc06db27b903e58569c21c86dd0528ba37027567cbe73a9ee18bee"}
Oct 02 12:58:35 crc kubenswrapper[4658]: I1002 12:58:35.565590 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" event={"ID":"75a3fbaf-6798-43b3-a912-fb1afa675811","Type":"ContainerStarted","Data":"47a4ad3fed7b9364d72a9b95cfe0e1328567085514ae69406aa758e646c1dd3c"}
Oct 02 12:58:35 crc kubenswrapper[4658]: I1002 12:58:35.586605 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" podStartSLOduration=2.586587342 podStartE2EDuration="2.586587342s" podCreationTimestamp="2025-10-02 12:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:58:35.582920044 +0000 UTC m=+5996.474073631" watchObservedRunningTime="2025-10-02 12:58:35.586587342 +0000 UTC m=+5996.477740909"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.412053 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-5kv9d"]
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.414959 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.417285 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5fsxr"/"default-dockercfg-q9qdm"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.509374 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9c9\" (UniqueName: \"kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.509518 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.610858 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9c9\" (UniqueName: \"kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.610918 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.611018 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.629186 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9c9\" (UniqueName: \"kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9\") pod \"crc-debug-5kv9d\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") " pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: I1002 12:58:38.732456 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 12:58:38 crc kubenswrapper[4658]: W1002 12:58:38.773834 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15102a44_06a6_4ecb_b7f0_f094d14e6c3c.slice/crio-d24c0d9b80c80536c4bd87bffa0fa62b41ac7c3c9201afd64fb6c4c45e23a6ce WatchSource:0}: Error finding container d24c0d9b80c80536c4bd87bffa0fa62b41ac7c3c9201afd64fb6c4c45e23a6ce: Status 404 returned error can't find the container with id d24c0d9b80c80536c4bd87bffa0fa62b41ac7c3c9201afd64fb6c4c45e23a6ce
Oct 02 12:58:39 crc kubenswrapper[4658]: I1002 12:58:39.611019 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d" event={"ID":"15102a44-06a6-4ecb-b7f0-f094d14e6c3c","Type":"ContainerStarted","Data":"906e94a8ae08eabf4cebdbb254079d0d22d6ef6beec0f6f90580006348d3ee4b"}
Oct 02 12:58:39 crc kubenswrapper[4658]: I1002 12:58:39.611613 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d" event={"ID":"15102a44-06a6-4ecb-b7f0-f094d14e6c3c","Type":"ContainerStarted","Data":"d24c0d9b80c80536c4bd87bffa0fa62b41ac7c3c9201afd64fb6c4c45e23a6ce"}
Oct 02 12:58:39 crc kubenswrapper[4658]: I1002 12:58:39.631854 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d" podStartSLOduration=1.631834998 podStartE2EDuration="1.631834998s" podCreationTimestamp="2025-10-02 12:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:58:39.624314097 +0000 UTC m=+6000.515467664" watchObservedRunningTime="2025-10-02 12:58:39.631834998 +0000 UTC m=+6000.522988565"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.342048 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"]
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.349540 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.353645 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"]
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.531158 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.531225 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scg2q\" (UniqueName: \"kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.531395 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.633439 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.633498 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scg2q\" (UniqueName: \"kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.633593 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.634087 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.634341 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722"
Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.658040 4658 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-scg2q\" (UniqueName: \"kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q\") pod \"redhat-marketplace-mf722\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:44 crc kubenswrapper[4658]: I1002 12:58:44.686254 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:45 crc kubenswrapper[4658]: I1002 12:58:45.215451 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"] Oct 02 12:58:45 crc kubenswrapper[4658]: I1002 12:58:45.664958 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerID="1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff" exitCode=0 Oct 02 12:58:45 crc kubenswrapper[4658]: I1002 12:58:45.665038 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerDied","Data":"1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff"} Oct 02 12:58:45 crc kubenswrapper[4658]: I1002 12:58:45.665278 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerStarted","Data":"3eeb28821358bccd0bdf714eb81fd486cb7b7a49b6af2220f4f08b29dc640b79"} Oct 02 12:58:47 crc kubenswrapper[4658]: I1002 12:58:47.949792 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:58:47 crc kubenswrapper[4658]: E1002 12:58:47.950702 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:58:48 crc kubenswrapper[4658]: I1002 12:58:48.706827 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerID="e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f" exitCode=0 Oct 02 12:58:48 crc kubenswrapper[4658]: I1002 12:58:48.706885 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerDied","Data":"e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f"} Oct 02 12:58:49 crc kubenswrapper[4658]: I1002 12:58:49.718793 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerStarted","Data":"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719"} Oct 02 12:58:49 crc kubenswrapper[4658]: I1002 12:58:49.736697 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mf722" podStartSLOduration=2.310114149 podStartE2EDuration="5.736668542s" podCreationTimestamp="2025-10-02 12:58:44 +0000 UTC" firstStartedPulling="2025-10-02 12:58:45.685699343 +0000 UTC m=+6006.576852900" lastFinishedPulling="2025-10-02 
12:58:49.112253726 +0000 UTC m=+6010.003407293" observedRunningTime="2025-10-02 12:58:49.734745281 +0000 UTC m=+6010.625898848" watchObservedRunningTime="2025-10-02 12:58:49.736668542 +0000 UTC m=+6010.627822109" Oct 02 12:58:54 crc kubenswrapper[4658]: I1002 12:58:54.686851 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:54 crc kubenswrapper[4658]: I1002 12:58:54.688476 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:54 crc kubenswrapper[4658]: I1002 12:58:54.743664 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:54 crc kubenswrapper[4658]: I1002 12:58:54.828716 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:54 crc kubenswrapper[4658]: I1002 12:58:54.983200 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"] Oct 02 12:58:56 crc kubenswrapper[4658]: I1002 12:58:56.779956 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mf722" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="registry-server" containerID="cri-o://5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719" gracePeriod=2 Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.251399 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.385238 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content\") pod \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.385790 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scg2q\" (UniqueName: \"kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q\") pod \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.385903 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities\") pod \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\" (UID: \"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57\") " Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.386589 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities" (OuterVolumeSpecName: "utilities") pod "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" (UID: "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.398311 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q" (OuterVolumeSpecName: "kube-api-access-scg2q") pod "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" (UID: "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57"). InnerVolumeSpecName "kube-api-access-scg2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.401615 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" (UID: "cd22ceaa-8dee-4d0f-91d3-f57a8a51be57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.488742 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.488789 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scg2q\" (UniqueName: \"kubernetes.io/projected/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-kube-api-access-scg2q\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.488809 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.792259 4658 generic.go:334] "Generic (PLEG): container finished" podID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerID="5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719" exitCode=0 Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.792332 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerDied","Data":"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719"} Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.792356 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf722" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.792379 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf722" event={"ID":"cd22ceaa-8dee-4d0f-91d3-f57a8a51be57","Type":"ContainerDied","Data":"3eeb28821358bccd0bdf714eb81fd486cb7b7a49b6af2220f4f08b29dc640b79"} Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.792397 4658 scope.go:117] "RemoveContainer" containerID="5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.816270 4658 scope.go:117] "RemoveContainer" containerID="e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.844934 4658 scope.go:117] "RemoveContainer" containerID="1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.860256 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"] Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.871501 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf722"] Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.889563 4658 scope.go:117] "RemoveContainer" containerID="5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719" Oct 02 12:58:57 crc kubenswrapper[4658]: E1002 12:58:57.890659 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719\": container with ID starting with 5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719 not found: ID does not exist" containerID="5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.890704 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719"} err="failed to get container status \"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719\": rpc error: code = NotFound desc = could not find container \"5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719\": container with ID starting with 5eaa4c0c8ed83c7b8a84a6481b17b87f79b38129ea886d3b727c854943bfb719 not found: ID does not exist" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.890729 4658 scope.go:117] "RemoveContainer" containerID="e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f" Oct 02 12:58:57 crc kubenswrapper[4658]: E1002 12:58:57.891209 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f\": container with ID starting with e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f not found: ID does not exist" containerID="e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.891239 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f"} err="failed to get container status \"e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f\": rpc error: code = NotFound desc = could not find 
container \"e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f\": container with ID starting with e3249830dbdbace66f6062e90485f57b1cb35ff198a6248eddff7a9a9f81386f not found: ID does not exist" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.891260 4658 scope.go:117] "RemoveContainer" containerID="1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff" Oct 02 12:58:57 crc kubenswrapper[4658]: E1002 12:58:57.891519 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff\": container with ID starting with 1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff not found: ID does not exist" containerID="1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.891539 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff"} err="failed to get container status \"1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff\": rpc error: code = NotFound desc = could not find container \"1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff\": container with ID starting with 1b3b43787a85e5403f59206d3a940de3e7dce249e5052525e3cc4ffd665dadff not found: ID does not exist" Oct 02 12:58:57 crc kubenswrapper[4658]: I1002 12:58:57.962546 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" path="/var/lib/kubelet/pods/cd22ceaa-8dee-4d0f-91d3-f57a8a51be57/volumes" Oct 02 12:59:02 crc kubenswrapper[4658]: I1002 12:59:02.950782 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:59:02 crc kubenswrapper[4658]: E1002 12:59:02.951440 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:59:17 crc kubenswrapper[4658]: I1002 12:59:17.950061 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:59:17 crc kubenswrapper[4658]: E1002 12:59:17.950802 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 12:59:30 crc kubenswrapper[4658]: I1002 12:59:30.949792 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 12:59:32 crc kubenswrapper[4658]: I1002 12:59:32.143913 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7"} 
Oct 02 12:59:54 crc kubenswrapper[4658]: I1002 12:59:54.988975 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc95469d-r9kbr_456bb611-ccbc-4d1b-94bf-2ceb7d8345e3/barbican-api/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.030926 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc95469d-r9kbr_456bb611-ccbc-4d1b-94bf-2ceb7d8345e3/barbican-api-log/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.192919 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-54ff5bbf66-pmxfv_ed9f1355-f34e-479c-8030-c2848860beb6/barbican-keystone-listener/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.242561 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-54ff5bbf66-pmxfv_ed9f1355-f34e-479c-8030-c2848860beb6/barbican-keystone-listener-log/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.396014 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-698b689fd7-9wp8g_e5fc61f1-3fdf-430c-890e-4e220859285b/barbican-worker/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.459499 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-698b689fd7-9wp8g_e5fc61f1-3fdf-430c-890e-4e220859285b/barbican-worker-log/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.632384 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kbgxx_3e768ea4-04c3-4825-9431-a37f41f34a01/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.827863 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/ceilometer-central-agent/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.858879 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/ceilometer-notification-agent/0.log"
Oct 02 12:59:55 crc kubenswrapper[4658]: I1002 12:59:55.885108 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/proxy-httpd/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.015371 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4ef5828e-3cb4-4a6d-ba04-f474234450d3/sg-core/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.160154 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca5cc232-0768-4541-b654-03a61ffd7ddc/cinder-api/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.265968 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca5cc232-0768-4541-b654-03a61ffd7ddc/cinder-api-log/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.423255 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_efbe9a47-e907-4393-8f6a-9e1a824383f4/cinder-scheduler/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.510825 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_efbe9a47-e907-4393-8f6a-9e1a824383f4/probe/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.675820 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dzfcn_6eed4da6-fdf5-4db6-9e72-1d3052a54482/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.786753 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kzzml_09073a04-723b-4564-8f3a-efbc628cb7ef/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:56 crc kubenswrapper[4658]: I1002 12:59:56.982130 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lgf5p_c5792ae0-4758-472c-94b6-b4f313cc3462/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.162746 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/init/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.355851 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/init/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.591467 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bcf8b9d95-6qc52_d2ab47cf-8dcb-4517-b4de-a064181594e0/dnsmasq-dns/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.647906 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6b65f_43792b79-e840-4c83-b2b9-8068765b000a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.857183 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_67f8b15f-e190-40d6-8b7b-e8ba932f00f9/glance-httpd/0.log"
Oct 02 12:59:57 crc kubenswrapper[4658]: I1002 12:59:57.898206 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_67f8b15f-e190-40d6-8b7b-e8ba932f00f9/glance-log/0.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.112148 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6306f11-af13-4078-ad43-b00e333855b1/glance-httpd/0.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.151670 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6306f11-af13-4078-ad43-b00e333855b1/glance-log/0.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.508318 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon/0.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.515338 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon/1.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.819482 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vcxtq_8d5900ee-9fca-4a00-8343-b51c6728627d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:58 crc kubenswrapper[4658]: I1002 12:59:58.858580 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xk4g7_59aa0d09-3a44-4e0a-b2d2-7f297a223854/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 12:59:59 crc kubenswrapper[4658]: I1002 12:59:59.251703 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776f4bfd7b-cm7vj_02408c48-14d8-4a7b-8ebf-79fd2fa1b924/horizon-log/0.log"
Oct 02 12:59:59 crc kubenswrapper[4658]: I1002 12:59:59.256499 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323441-htvr7_a39e700e-3d2a-4deb-8ab5-ad53c0cf8276/keystone-cron/0.log"
Oct 02 12:59:59 crc kubenswrapper[4658]: I1002 12:59:59.423668 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f67801c0-f438-43ae-a45b-c2870b64f553/kube-state-metrics/0.log"
Oct 02 12:59:59 crc kubenswrapper[4658]: I1002 12:59:59.532526 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db8df9d95-jgkgn_e57e6b14-51e6-4efb-ba74-8e57b5e3aa72/keystone-api/0.log"
Oct 02 12:59:59 crc kubenswrapper[4658]: I1002 12:59:59.663725 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-59fjq_074ed90b-9bda-4d7f-819d-41f3e7569ac4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.152199 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"]
Oct 02 13:00:00 crc kubenswrapper[4658]: E1002 13:00:00.152649 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="registry-server"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.152661 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="registry-server"
Oct 02 13:00:00 crc kubenswrapper[4658]: E1002 13:00:00.152693 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="extract-content"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.152698 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="extract-content"
Oct 02 13:00:00 crc kubenswrapper[4658]: E1002 13:00:00.152711 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="extract-utilities"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.152718 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="extract-utilities"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.152937 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd22ceaa-8dee-4d0f-91d3-f57a8a51be57" containerName="registry-server"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.154320 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.158749 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.158825 4658 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.180281 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"]
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.190125 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8tmn\" (UniqueName: \"kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.190226 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.190323 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.205617 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6989c4ffd5-z7vdb_299ba238-fcb8-4f4b-94ea-73ac08404680/neutron-httpd/0.log"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.236889 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fljlb_ce4527ad-6f4e-4ea8-b0ec-35cdc554a42f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.292327 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8tmn\" (UniqueName: \"kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.292968 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.293163 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.294605 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.302895 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6989c4ffd5-z7vdb_299ba238-fcb8-4f4b-94ea-73ac08404680/neutron-api/0.log"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.312822 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.314173 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8tmn\" (UniqueName: \"kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn\") pod \"collect-profiles-29323500-sxcpp\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.495935 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:00 crc kubenswrapper[4658]: I1002 13:00:00.989057 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"]
Oct 02 13:00:01 crc kubenswrapper[4658]: I1002 13:00:01.435409 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8441c161-18f6-46d9-a327-ac3857d077d2/nova-cell0-conductor-conductor/0.log"
Oct 02 13:00:01 crc kubenswrapper[4658]: I1002 13:00:01.452631 4658 generic.go:334] "Generic (PLEG): container finished" podID="1ae6f190-a3ff-4558-be4a-cdab4b592e9b" containerID="ccf2ddc4ca83c5b6839199ecae74b38d038dd105d356c6d12554ea13fe25c4ef" exitCode=0
Oct 02 13:00:01 crc kubenswrapper[4658]: I1002 13:00:01.452672 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp" event={"ID":"1ae6f190-a3ff-4558-be4a-cdab4b592e9b","Type":"ContainerDied","Data":"ccf2ddc4ca83c5b6839199ecae74b38d038dd105d356c6d12554ea13fe25c4ef"}
Oct 02 13:00:01 crc kubenswrapper[4658]: I1002 13:00:01.452699 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp" event={"ID":"1ae6f190-a3ff-4558-be4a-cdab4b592e9b","Type":"ContainerStarted","Data":"456a56bbb1af0ad3ac3fe4fd20db337082329382f6688761a70c058e2778cc18"}
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.013302 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2b26ff3c-8765-4911-aee2-54a863e4fd7c/nova-api-log/0.log"
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.100796 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d623c2ea-e4e8-4031-af93-35f76f08dba2/nova-cell1-conductor-conductor/0.log"
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.516176 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2b26ff3c-8765-4911-aee2-54a863e4fd7c/nova-api-api/0.log"
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.617329 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7e69ac9b-be4b-4d88-bf64-06f4ca3966ba/nova-cell1-novncproxy-novncproxy/0.log"
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.848187 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qg2dq_4d537487-cd7a-43bd-ba29-fc9df6af7913/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:02 crc kubenswrapper[4658]: I1002 13:00:02.879631 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.002474 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f818de7d-6833-4011-aded-a3de906237c4/nova-metadata-log/0.log"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.047391 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume\") pod \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") "
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.047508 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8tmn\" (UniqueName: \"kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn\") pod \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") "
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.047678 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume\") pod \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\" (UID: \"1ae6f190-a3ff-4558-be4a-cdab4b592e9b\") "
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.048938 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ae6f190-a3ff-4558-be4a-cdab4b592e9b" (UID: "1ae6f190-a3ff-4558-be4a-cdab4b592e9b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.056896 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ae6f190-a3ff-4558-be4a-cdab4b592e9b" (UID: "1ae6f190-a3ff-4558-be4a-cdab4b592e9b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.068825 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn" (OuterVolumeSpecName: "kube-api-access-l8tmn") pod "1ae6f190-a3ff-4558-be4a-cdab4b592e9b" (UID: "1ae6f190-a3ff-4558-be4a-cdab4b592e9b"). InnerVolumeSpecName "kube-api-access-l8tmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.149954 4658 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.149993 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8tmn\" (UniqueName: \"kubernetes.io/projected/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-kube-api-access-l8tmn\") on node \"crc\" DevicePath \"\""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.150009 4658 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae6f190-a3ff-4558-be4a-cdab4b592e9b-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.473416 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp" event={"ID":"1ae6f190-a3ff-4558-be4a-cdab4b592e9b","Type":"ContainerDied","Data":"456a56bbb1af0ad3ac3fe4fd20db337082329382f6688761a70c058e2778cc18"}
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.473462 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="456a56bbb1af0ad3ac3fe4fd20db337082329382f6688761a70c058e2778cc18"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.473520 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-sxcpp"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.538069 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/mysql-bootstrap/0.log"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.580053 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c066a72f-72df-47f5-b481-12ba73cb8d5f/nova-scheduler-scheduler/0.log"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.777784 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/mysql-bootstrap/0.log"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.795579 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ecaec123-d0cf-493f-bee4-b32cd4f084bf/galera/0.log"
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.961863 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg"]
Oct 02 13:00:03 crc kubenswrapper[4658]: I1002 13:00:03.976168 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-k26jg"]
Oct 02 13:00:04 crc kubenswrapper[4658]: I1002 13:00:04.063696 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/mysql-bootstrap/0.log"
Oct 02 13:00:04 crc kubenswrapper[4658]: I1002 13:00:04.255668 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/galera/0.log"
Oct 02 13:00:04 crc kubenswrapper[4658]: I1002 13:00:04.326795 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_590179b8-356d-4392-bab5-037103481383/mysql-bootstrap/0.log"
Oct 02 13:00:04 crc kubenswrapper[4658]: I1002 13:00:04.543833 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_53d4842f-7f97-4191-bcea-c8076517503f/openstackclient/0.log"
Oct 02 13:00:04 crc kubenswrapper[4658]: I1002 13:00:04.771143 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-h2htr_ed2f1df6-db7a-483e-a80d-298f12a389c8/ovn-controller/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.010497 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rjq6k_313d8a11-a864-4fe8-b083-cc3f713cd4f7/openstack-network-exporter/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.282634 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server-init/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.510152 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f818de7d-6833-4011-aded-a3de906237c4/nova-metadata-metadata/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.538566 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovs-vswitchd/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.551813 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server-init/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.694271 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbnj8_ff110d7e-a1dd-4a53-99c8-995af4a9d039/ovsdb-server/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.831217 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gkwgw_3e5fa727-3d1f-4293-a2c2-33ba1f10ae2b/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:05 crc kubenswrapper[4658]: I1002 13:00:05.962615 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221bb351-32c3-4da4-8cb1-92f3ec37e89d" path="/var/lib/kubelet/pods/221bb351-32c3-4da4-8cb1-92f3ec37e89d/volumes"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.006282 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57cd238e-33f1-4536-bcf1-1ca7e57a141a/openstack-network-exporter/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.069370 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57cd238e-33f1-4536-bcf1-1ca7e57a141a/ovn-northd/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.197777 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e17b8e1f-e0a9-4648-b16b-1f62fa63d507/openstack-network-exporter/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.279190 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e17b8e1f-e0a9-4648-b16b-1f62fa63d507/ovsdbserver-nb/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.382574 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_44a349ce-b770-4e0a-bc23-afb9bdea6eba/openstack-network-exporter/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.497728 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_44a349ce-b770-4e0a-bc23-afb9bdea6eba/ovsdbserver-sb/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.840553 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-574d544bd8-7g449_c77ff071-5d94-49df-a4b3-25c8dd727b6e/placement-api/0.log"
Oct 02 13:00:06 crc kubenswrapper[4658]: I1002 13:00:06.994562 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-574d544bd8-7g449_c77ff071-5d94-49df-a4b3-25c8dd727b6e/placement-log/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.026529 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/init-config-reloader/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.242881 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/init-config-reloader/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.263045 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/config-reloader/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.281534 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/prometheus/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.429061 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b8e966f-7f02-41e2-8022-99deb47a8c93/thanos-sidecar/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.560974 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/setup-container/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.712430 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/setup-container/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.755259 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6406a7e-4303-43ed-bb07-2816e29af04c/rabbitmq/0.log"
Oct 02 13:00:07 crc kubenswrapper[4658]: I1002 13:00:07.929225 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/setup-container/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.099612 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/rabbitmq/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.190638 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a129e57-376b-4bc6-8d0c-c667d692d487/setup-container/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.396176 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pwjf8_f8637fd5-d51c-4da2-a043-98c8f655f10f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.520526 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kwgw7_270f59c2-b21f-4b38-821c-5c1b4ce0be21/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.786717 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n28mb_4dbacd18-944b-4b5f-be12-5ac2c1cb163a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:08 crc kubenswrapper[4658]: I1002 13:00:08.854111 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bztkh_26a7e52d-c3b7-4a7d-ae46-c2f32adb479a/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.005545 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz25k_5a2e4e7a-11ed-4e29-b2f3-28919813fa63/ssh-known-hosts-edpm-deployment/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.233281 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5566488b4c-k88mg_67435e65-47df-41df-9570-df74c35bd5fc/proxy-server/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.367022 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5566488b4c-k88mg_67435e65-47df-41df-9570-df74c35bd5fc/proxy-httpd/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.438440 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fkzqr_0909c66f-f3c6-440c-add2-8784d1c209c7/swift-ring-rebalance/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.641506 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-reaper/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.657710 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-auditor/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.829149 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-server/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.855325 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-auditor/0.log"
Oct 02 13:00:09 crc kubenswrapper[4658]: I1002 13:00:09.869309 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/account-replicator/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.069548 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-replicator/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.073091 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-server/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.083691 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/container-updater/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.274938 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-expirer/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.306139 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-auditor/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.306770 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-replicator/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.444319 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-server/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.514271 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/rsync/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.583998 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/object-updater/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.642522 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6d0e9bcc-e466-4017-92b9-d12e55fc7953/swift-recon-cron/0.log"
Oct 02 13:00:10 crc kubenswrapper[4658]: I1002 13:00:10.900711 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8rgbp_7d923299-fe7c-4ece-8f48-7c95a141f4c8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:11 crc kubenswrapper[4658]: I1002 13:00:11.035989 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fd9ceedd-f5a7-425a-9112-998edc1d3e00/tempest-tests-tempest-tests-runner/0.log"
Oct 02 13:00:11 crc kubenswrapper[4658]: I1002 13:00:11.115956 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9fb066d3-ce67-4635-bebd-2e24da16a2a8/test-operator-logs-container/0.log"
Oct 02 13:00:11 crc kubenswrapper[4658]: I1002 13:00:11.332808 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9m2k5_bfda0e17-a4e9-4a4f-9678-418901ed432a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 02 13:00:12 crc kubenswrapper[4658]: I1002 13:00:12.331424 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_dba2292e-4150-4a9d-9b22-49482e381c6c/watcher-applier/0.log"
Oct 02 13:00:12 crc kubenswrapper[4658]: I1002 13:00:12.561200 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a963ca85-eeb4-4678-849f-b5b980b36091/watcher-api-log/0.log"
Oct 02 13:00:14 crc kubenswrapper[4658]: I1002 13:00:14.050986 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_34ba94d4-e1db-40a9-93e7-5a4e053ae8db/watcher-decision-engine/0.log"
Oct 02 13:00:16 crc kubenswrapper[4658]: I1002 13:00:16.404985 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_a963ca85-eeb4-4678-849f-b5b980b36091/watcher-api/0.log"
Oct 02 13:00:18 crc kubenswrapper[4658]: I1002 13:00:18.406229 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3f3cc404-a92f-4ef8-a799-83eb314e4382/memcached/0.log"
Oct 02 13:00:25 crc kubenswrapper[4658]: I1002 13:00:25.981983 4658 scope.go:117] "RemoveContainer" containerID="ef08a02f063a1e7f4dc3284d3fa4c987a6164000a93a30cc5f6ec86006e38b01"
Oct 02 13:00:26 crc kubenswrapper[4658]: I1002 13:00:26.014107 4658 scope.go:117] "RemoveContainer" containerID="08abc25f9255ab4102f90a64dd5c1053ea21aa34f17ca33252f0d3ea5307b5d5"
Oct 02 13:00:41 crc kubenswrapper[4658]: I1002 13:00:41.832474 4658 generic.go:334] "Generic (PLEG): container finished" podID="15102a44-06a6-4ecb-b7f0-f094d14e6c3c" containerID="906e94a8ae08eabf4cebdbb254079d0d22d6ef6beec0f6f90580006348d3ee4b" exitCode=0
Oct 02 13:00:41 crc kubenswrapper[4658]: I1002 13:00:41.832536 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d" event={"ID":"15102a44-06a6-4ecb-b7f0-f094d14e6c3c","Type":"ContainerDied","Data":"906e94a8ae08eabf4cebdbb254079d0d22d6ef6beec0f6f90580006348d3ee4b"}
Oct 02 13:00:42 crc kubenswrapper[4658]: I1002 13:00:42.946447 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.011441 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-5kv9d"]
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.029413 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-5kv9d"]
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.085211 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9c9\" (UniqueName: \"kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9\") pod \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") "
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.085319 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host\") pod \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\" (UID: \"15102a44-06a6-4ecb-b7f0-f094d14e6c3c\") "
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.085493 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host" (OuterVolumeSpecName: "host") pod "15102a44-06a6-4ecb-b7f0-f094d14e6c3c" (UID: "15102a44-06a6-4ecb-b7f0-f094d14e6c3c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.085999 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-host\") on node \"crc\" DevicePath \"\""
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.090514 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9" (OuterVolumeSpecName: "kube-api-access-gm9c9") pod "15102a44-06a6-4ecb-b7f0-f094d14e6c3c" (UID: "15102a44-06a6-4ecb-b7f0-f094d14e6c3c"). InnerVolumeSpecName "kube-api-access-gm9c9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.187619 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9c9\" (UniqueName: \"kubernetes.io/projected/15102a44-06a6-4ecb-b7f0-f094d14e6c3c-kube-api-access-gm9c9\") on node \"crc\" DevicePath \"\""
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.857056 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d24c0d9b80c80536c4bd87bffa0fa62b41ac7c3c9201afd64fb6c4c45e23a6ce"
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.857124 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-5kv9d"
Oct 02 13:00:43 crc kubenswrapper[4658]: I1002 13:00:43.970102 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15102a44-06a6-4ecb-b7f0-f094d14e6c3c" path="/var/lib/kubelet/pods/15102a44-06a6-4ecb-b7f0-f094d14e6c3c/volumes"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.175286 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-ppk29"]
Oct 02 13:00:44 crc kubenswrapper[4658]: E1002 13:00:44.175686 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15102a44-06a6-4ecb-b7f0-f094d14e6c3c" containerName="container-00"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.175698 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="15102a44-06a6-4ecb-b7f0-f094d14e6c3c" containerName="container-00"
Oct 02 13:00:44 crc kubenswrapper[4658]: E1002 13:00:44.175712 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae6f190-a3ff-4558-be4a-cdab4b592e9b" containerName="collect-profiles"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.175718 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6f190-a3ff-4558-be4a-cdab4b592e9b" containerName="collect-profiles"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.175927 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae6f190-a3ff-4558-be4a-cdab4b592e9b" containerName="collect-profiles"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.175942 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="15102a44-06a6-4ecb-b7f0-f094d14e6c3c" containerName="container-00"
Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.176670 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.178500 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5fsxr"/"default-dockercfg-q9qdm" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.309260 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.309889 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb4k6\" (UniqueName: \"kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.411360 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb4k6\" (UniqueName: \"kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.411476 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.411609 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.443927 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb4k6\" (UniqueName: \"kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6\") pod \"crc-debug-ppk29\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.498951 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.870476 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" event={"ID":"0631172c-7bc1-4af5-b3f9-5718147fdbe6","Type":"ContainerStarted","Data":"c049badb01f7f24a21a6822280a4d9aac3adcb2174969db14d130917b80dad91"} Oct 02 13:00:44 crc kubenswrapper[4658]: I1002 13:00:44.870918 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" event={"ID":"0631172c-7bc1-4af5-b3f9-5718147fdbe6","Type":"ContainerStarted","Data":"7c9719c1f70966b6deb43ee4966635011676aeb25904851239eb5fc80ceada24"} Oct 02 13:00:45 crc kubenswrapper[4658]: I1002 13:00:45.881659 4658 generic.go:334] "Generic (PLEG): container finished" podID="0631172c-7bc1-4af5-b3f9-5718147fdbe6" containerID="c049badb01f7f24a21a6822280a4d9aac3adcb2174969db14d130917b80dad91" exitCode=0 Oct 02 13:00:45 crc kubenswrapper[4658]: I1002 13:00:45.882026 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" event={"ID":"0631172c-7bc1-4af5-b3f9-5718147fdbe6","Type":"ContainerDied","Data":"c049badb01f7f24a21a6822280a4d9aac3adcb2174969db14d130917b80dad91"} Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.004089 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.154923 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb4k6\" (UniqueName: \"kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6\") pod \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.155452 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host\") pod \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\" (UID: \"0631172c-7bc1-4af5-b3f9-5718147fdbe6\") " Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.155546 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host" (OuterVolumeSpecName: "host") pod "0631172c-7bc1-4af5-b3f9-5718147fdbe6" (UID: "0631172c-7bc1-4af5-b3f9-5718147fdbe6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.156094 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0631172c-7bc1-4af5-b3f9-5718147fdbe6-host\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.165250 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6" (OuterVolumeSpecName: "kube-api-access-mb4k6") pod "0631172c-7bc1-4af5-b3f9-5718147fdbe6" (UID: "0631172c-7bc1-4af5-b3f9-5718147fdbe6"). InnerVolumeSpecName "kube-api-access-mb4k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.257367 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb4k6\" (UniqueName: \"kubernetes.io/projected/0631172c-7bc1-4af5-b3f9-5718147fdbe6-kube-api-access-mb4k6\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.900418 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" event={"ID":"0631172c-7bc1-4af5-b3f9-5718147fdbe6","Type":"ContainerDied","Data":"7c9719c1f70966b6deb43ee4966635011676aeb25904851239eb5fc80ceada24"} Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.900463 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9719c1f70966b6deb43ee4966635011676aeb25904851239eb5fc80ceada24" Oct 02 13:00:47 crc kubenswrapper[4658]: I1002 13:00:47.900471 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-ppk29" Oct 02 13:00:54 crc kubenswrapper[4658]: I1002 13:00:54.210737 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-ppk29"] Oct 02 13:00:54 crc kubenswrapper[4658]: I1002 13:00:54.219269 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-ppk29"] Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.374341 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-lzms9"] Oct 02 13:00:55 crc kubenswrapper[4658]: E1002 13:00:55.375030 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0631172c-7bc1-4af5-b3f9-5718147fdbe6" containerName="container-00" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.375042 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="0631172c-7bc1-4af5-b3f9-5718147fdbe6" containerName="container-00" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.375254 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="0631172c-7bc1-4af5-b3f9-5718147fdbe6" containerName="container-00" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.375956 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.378377 4658 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5fsxr"/"default-dockercfg-q9qdm" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.488415 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.488500 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zwj\" (UniqueName: \"kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.590957 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.591041 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zwj\" (UniqueName: \"kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.591109 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.613006 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zwj\" (UniqueName: \"kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj\") pod \"crc-debug-lzms9\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.699212 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.984753 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0631172c-7bc1-4af5-b3f9-5718147fdbe6" path="/var/lib/kubelet/pods/0631172c-7bc1-4af5-b3f9-5718147fdbe6/volumes" Oct 02 13:00:55 crc kubenswrapper[4658]: I1002 13:00:55.988104 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" event={"ID":"f663f5fb-f9f4-41ec-99b4-a3b491aadd97","Type":"ContainerStarted","Data":"c86cba328ecf27b420bba3ef2efaec8c5cf1aa4940ee50cb8bd6dfdd15f4c12e"} Oct 02 13:00:57 crc kubenswrapper[4658]: I1002 13:00:57.003861 4658 generic.go:334] "Generic (PLEG): container finished" podID="f663f5fb-f9f4-41ec-99b4-a3b491aadd97" containerID="13a2fb7e775cbd6f5fa90c2dd0b36fcbfc19b3d3fc8abf1192badd9b08eb3108" exitCode=0 Oct 02 13:00:57 crc kubenswrapper[4658]: I1002 13:00:57.003968 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" event={"ID":"f663f5fb-f9f4-41ec-99b4-a3b491aadd97","Type":"ContainerDied","Data":"13a2fb7e775cbd6f5fa90c2dd0b36fcbfc19b3d3fc8abf1192badd9b08eb3108"} Oct 02 13:00:57 crc kubenswrapper[4658]: I1002 13:00:57.042437 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-lzms9"] Oct 02 13:00:57 crc kubenswrapper[4658]: I1002 13:00:57.051178 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5fsxr/crc-debug-lzms9"] Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.139991 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.252863 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host\") pod \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.252909 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zwj\" (UniqueName: \"kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj\") pod \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\" (UID: \"f663f5fb-f9f4-41ec-99b4-a3b491aadd97\") " Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.253225 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host" (OuterVolumeSpecName: "host") pod "f663f5fb-f9f4-41ec-99b4-a3b491aadd97" (UID: "f663f5fb-f9f4-41ec-99b4-a3b491aadd97"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.254036 4658 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-host\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.258639 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj" (OuterVolumeSpecName: "kube-api-access-n5zwj") pod "f663f5fb-f9f4-41ec-99b4-a3b491aadd97" (UID: "f663f5fb-f9f4-41ec-99b4-a3b491aadd97"). InnerVolumeSpecName "kube-api-access-n5zwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.356125 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zwj\" (UniqueName: \"kubernetes.io/projected/f663f5fb-f9f4-41ec-99b4-a3b491aadd97-kube-api-access-n5zwj\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.633500 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.829982 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.849598 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 13:00:58 crc kubenswrapper[4658]: I1002 13:00:58.921841 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.022487 4658 scope.go:117] "RemoveContainer" containerID="13a2fb7e775cbd6f5fa90c2dd0b36fcbfc19b3d3fc8abf1192badd9b08eb3108" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.022529 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/crc-debug-lzms9" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.046725 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/util/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.047166 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/pull/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.055170 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_051d61b069b17a0ad6b29bad71737f6f2da74549134e42a6ab6f982aa66th6m_8a703ab4-d1c1-417b-8f0b-7530ed09a26a/extract/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.187948 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-kkldn_3f426838-95ca-4579-9745-e78f0ccab683/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.306719 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-kkldn_3f426838-95ca-4579-9745-e78f0ccab683/manager/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.324917 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gckv9_7744dcc1-5c52-4447-8123-53e4c98250fd/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.421312 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gckv9_7744dcc1-5c52-4447-8123-53e4c98250fd/manager/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.507269 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fgm4w_af944184-d59a-467d-983e-c66fb79823c6/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.519716 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fgm4w_af944184-d59a-467d-983e-c66fb79823c6/manager/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.652403 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8ttj2_70026a4a-6db4-4777-afed-a5ea3de1fc60/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.763072 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8ttj2_70026a4a-6db4-4777-afed-a5ea3de1fc60/manager/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.804157 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-66q5b_6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.846582 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-66q5b_6e248b8c-b6bb-42e2-b6ac-c8a97b5d068c/manager/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.961039 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f663f5fb-f9f4-41ec-99b4-a3b491aadd97" path="/var/lib/kubelet/pods/f663f5fb-f9f4-41ec-99b4-a3b491aadd97/volumes" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.974400 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-ljz2h_d480d1a6-c309-454f-8e99-a762feed8490/kube-rbac-proxy/0.log" Oct 02 13:00:59 crc kubenswrapper[4658]: I1002 13:00:59.999837 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-ljz2h_d480d1a6-c309-454f-8e99-a762feed8490/manager/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.138847 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-kznvq_f527a8e5-d051-4017-80e4-e3b2f1fd59ba/kube-rbac-proxy/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.160165 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323501-dszvk"] Oct 02 13:01:00 crc kubenswrapper[4658]: E1002 13:01:00.160661 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f663f5fb-f9f4-41ec-99b4-a3b491aadd97" containerName="container-00" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.160681 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="f663f5fb-f9f4-41ec-99b4-a3b491aadd97" containerName="container-00" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.160889 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="f663f5fb-f9f4-41ec-99b4-a3b491aadd97" containerName="container-00" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.161619 4658 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.193399 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.193695 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.193775 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.193816 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6gn\" (UniqueName: \"kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.193986 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-dszvk"] Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.295155 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.295633 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.295739 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6gn\" (UniqueName: \"kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.295892 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.301882 4658 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.303359 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.306905 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.322199 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6gn\" (UniqueName: \"kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn\") pod \"keystone-cron-29323501-dszvk\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.331250 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7mfsk_bf9ac0a3-4903-4115-9793-b6bd913d4e0a/kube-rbac-proxy/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.380244 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-kznvq_f527a8e5-d051-4017-80e4-e3b2f1fd59ba/manager/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.416827 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7mfsk_bf9ac0a3-4903-4115-9793-b6bd913d4e0a/manager/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.500209 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.562178 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-l62bl_d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b/kube-rbac-proxy/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.688037 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-l62bl_d9400643-d8ff-4e59-aa6d-e1d3d9eeef1b/manager/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.865051 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-tnfxq_55b04e2c-c701-4f74-9fb6-1dce9d2de108/kube-rbac-proxy/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.889914 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-tnfxq_55b04e2c-c701-4f74-9fb6-1dce9d2de108/manager/0.log" Oct 02 13:01:00 crc kubenswrapper[4658]: I1002 13:01:00.954354 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-g8dwz_9787421c-8d35-4d30-8946-90bc71eba9c0/kube-rbac-proxy/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.008775 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-dszvk"] Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.049189 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-dszvk" event={"ID":"2d752f5b-a53c-492a-9b01-ae76a861153e","Type":"ContainerStarted","Data":"bfef28d771b0132e52a3fe9a713e4137f6ae53665edd8c507acdefa2e8ca6cc0"} Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.075024 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-g8dwz_9787421c-8d35-4d30-8946-90bc71eba9c0/manager/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.154850 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-fsnf7_6a460926-8982-40c1-b177-3620aa3dcb79/kube-rbac-proxy/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.230982 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-fsnf7_6a460926-8982-40c1-b177-3620aa3dcb79/manager/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.357447 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-htz9g_830f6e33-ad1f-4033-a725-9f10415996e7/kube-rbac-proxy/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.437949 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-htz9g_830f6e33-ad1f-4033-a725-9f10415996e7/manager/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.517019 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-kbj6t_5aeb03f1-db88-497b-b3cb-11e01e2a7b31/kube-rbac-proxy/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.573804 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-kbj6t_5aeb03f1-db88-497b-b3cb-11e01e2a7b31/manager/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.711432 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ffhdh_afbaa143-b11e-406d-b797-6ba114fbf9a4/manager/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.725692 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ffhdh_afbaa143-b11e-406d-b797-6ba114fbf9a4/kube-rbac-proxy/0.log" Oct 02 13:01:01 crc kubenswrapper[4658]: I1002 13:01:01.845703 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f6b64f7bf-8c66j_903dfdb7-34f3-4875-8009-482cb7d5469b/kube-rbac-proxy/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.060019 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f47f5dc76-j82tf_943e808f-860b-4f8a-a933-84f0dd0cddc5/kube-rbac-proxy/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.061137 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-dszvk" event={"ID":"2d752f5b-a53c-492a-9b01-ae76a861153e","Type":"ContainerStarted","Data":"cfaa2d1d8f8b88ffc8c936ee64f911a49b8c73871e2c48cfaf4c8a860501d462"} Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.076147 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323501-dszvk" podStartSLOduration=2.076127547 podStartE2EDuration="2.076127547s" podCreationTimestamp="2025-10-02 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:02.073157102 +0000 UTC m=+6142.964310669" watchObservedRunningTime="2025-10-02 13:01:02.076127547 +0000 UTC m=+6142.967281114" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.286162 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f47f5dc76-j82tf_943e808f-860b-4f8a-a933-84f0dd0cddc5/operator/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.322743 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4jglm_71959757-609a-415a-9717-711c3f8ad66d/registry-server/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.406817 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-mhrcv_7b2e2130-4b00-4242-8254-c8be160bfe89/kube-rbac-proxy/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.561105 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-wqqdv_c802dbff-c65f-40e9-91ee-3ea6f0aee6a2/kube-rbac-proxy/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.579636 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-mhrcv_7b2e2130-4b00-4242-8254-c8be160bfe89/manager/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.643428 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-wqqdv_c802dbff-c65f-40e9-91ee-3ea6f0aee6a2/manager/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.853598 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-xw82t_75df76ba-0998-4b89-887e-d8f0b1c546b4/operator/0.log" Oct 02 13:01:02 crc kubenswrapper[4658]: I1002 13:01:02.946864 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-ppg68_c92dcd56-734e-430c-813e-1405ab2e141b/kube-rbac-proxy/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.069815 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-ppg68_c92dcd56-734e-430c-813e-1405ab2e141b/manager/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.117066 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-49k5r_33b8c756-1330-4114-bf78-2b3835667a1e/kube-rbac-proxy/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.205224 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f6b64f7bf-8c66j_903dfdb7-34f3-4875-8009-482cb7d5469b/manager/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.308786 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-4bhqs_3dba06c0-4986-438c-a553-76b0bcddd74c/kube-rbac-proxy/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.364278 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-4bhqs_3dba06c0-4986-438c-a553-76b0bcddd74c/manager/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.434751 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-49k5r_33b8c756-1330-4114-bf78-2b3835667a1e/manager/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.534331 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7fc7d86889-mqpv9_e9eb741d-265d-4f59-ab6e-c6a42f720801/kube-rbac-proxy/0.log" Oct 02 13:01:03 crc kubenswrapper[4658]: I1002 13:01:03.600617 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7fc7d86889-mqpv9_e9eb741d-265d-4f59-ab6e-c6a42f720801/manager/0.log" Oct 02 13:01:05 crc kubenswrapper[4658]: I1002 13:01:05.094409 4658 generic.go:334] "Generic (PLEG): container finished" podID="2d752f5b-a53c-492a-9b01-ae76a861153e" containerID="cfaa2d1d8f8b88ffc8c936ee64f911a49b8c73871e2c48cfaf4c8a860501d462" exitCode=0 Oct 02 13:01:05 crc kubenswrapper[4658]: I1002 13:01:05.094518 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-dszvk" event={"ID":"2d752f5b-a53c-492a-9b01-ae76a861153e","Type":"ContainerDied","Data":"cfaa2d1d8f8b88ffc8c936ee64f911a49b8c73871e2c48cfaf4c8a860501d462"} Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.447363 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.514070 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data\") pod \"2d752f5b-a53c-492a-9b01-ae76a861153e\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.514129 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys\") pod \"2d752f5b-a53c-492a-9b01-ae76a861153e\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.514175 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6gn\" (UniqueName: \"kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn\") pod \"2d752f5b-a53c-492a-9b01-ae76a861153e\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.514312 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle\") pod \"2d752f5b-a53c-492a-9b01-ae76a861153e\" (UID: \"2d752f5b-a53c-492a-9b01-ae76a861153e\") " Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.523423 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d752f5b-a53c-492a-9b01-ae76a861153e" (UID: "2d752f5b-a53c-492a-9b01-ae76a861153e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.543805 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn" (OuterVolumeSpecName: "kube-api-access-sm6gn") pod "2d752f5b-a53c-492a-9b01-ae76a861153e" (UID: "2d752f5b-a53c-492a-9b01-ae76a861153e"). InnerVolumeSpecName "kube-api-access-sm6gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.598944 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data" (OuterVolumeSpecName: "config-data") pod "2d752f5b-a53c-492a-9b01-ae76a861153e" (UID: "2d752f5b-a53c-492a-9b01-ae76a861153e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.599539 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d752f5b-a53c-492a-9b01-ae76a861153e" (UID: "2d752f5b-a53c-492a-9b01-ae76a861153e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.618711 4658 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.618762 4658 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.618775 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6gn\" (UniqueName: \"kubernetes.io/projected/2d752f5b-a53c-492a-9b01-ae76a861153e-kube-api-access-sm6gn\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4658]: I1002 13:01:06.618788 4658 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d752f5b-a53c-492a-9b01-ae76a861153e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4658]: I1002 13:01:07.117040 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-dszvk" event={"ID":"2d752f5b-a53c-492a-9b01-ae76a861153e","Type":"ContainerDied","Data":"bfef28d771b0132e52a3fe9a713e4137f6ae53665edd8c507acdefa2e8ca6cc0"} Oct 02 13:01:07 crc kubenswrapper[4658]: I1002 13:01:07.117104 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323501-dszvk" Oct 02 13:01:07 crc kubenswrapper[4658]: I1002 13:01:07.117109 4658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfef28d771b0132e52a3fe9a713e4137f6ae53665edd8c507acdefa2e8ca6cc0" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.027144 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:18 crc kubenswrapper[4658]: E1002 13:01:18.028079 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d752f5b-a53c-492a-9b01-ae76a861153e" containerName="keystone-cron" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.028092 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d752f5b-a53c-492a-9b01-ae76a861153e" containerName="keystone-cron" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.028280 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d752f5b-a53c-492a-9b01-ae76a861153e" containerName="keystone-cron" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.029824 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.047607 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.136819 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgx9b\" (UniqueName: \"kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.137028 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.137059 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.238606 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgx9b\" (UniqueName: \"kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.238735 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.238774 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.239477 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.239450 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.272684 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wgx9b\" (UniqueName: \"kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b\") pod \"redhat-operators-f6gx2\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.354031 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:18 crc kubenswrapper[4658]: I1002 13:01:18.885995 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:18 crc kubenswrapper[4658]: W1002 13:01:18.901424 4658 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda543aaef_542e_4081_866c_c1393e20ea9e.slice/crio-3349bfcd229f16bb5aa406065d09f201436bff3e470ef418b7b292290a1f79cd WatchSource:0}: Error finding container 3349bfcd229f16bb5aa406065d09f201436bff3e470ef418b7b292290a1f79cd: Status 404 returned error can't find the container with id 3349bfcd229f16bb5aa406065d09f201436bff3e470ef418b7b292290a1f79cd Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.205658 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4wktl_2b161a36-8654-4948-8412-bb68940fe512/control-plane-machine-set-operator/0.log" Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.239956 4658 generic.go:334] "Generic (PLEG): container finished" podID="a543aaef-542e-4081-866c-c1393e20ea9e" containerID="e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a" exitCode=0 Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.240013 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerDied","Data":"e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a"} Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.240067 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerStarted","Data":"3349bfcd229f16bb5aa406065d09f201436bff3e470ef418b7b292290a1f79cd"} Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.574978 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gjt96_bfa1953c-4c82-4463-b772-6b871bcea9b8/kube-rbac-proxy/0.log" Oct 02 13:01:19 crc kubenswrapper[4658]: I1002 13:01:19.745390 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gjt96_bfa1953c-4c82-4463-b772-6b871bcea9b8/machine-api-operator/0.log" Oct 02 13:01:21 crc kubenswrapper[4658]: I1002 13:01:21.259693 4658 generic.go:334] "Generic (PLEG): container finished" podID="a543aaef-542e-4081-866c-c1393e20ea9e" containerID="bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0" exitCode=0 Oct 02 13:01:21 crc kubenswrapper[4658]: I1002 13:01:21.259788 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerDied","Data":"bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0"} Oct 02 13:01:22 crc kubenswrapper[4658]: I1002 13:01:22.272511 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerStarted","Data":"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121"} Oct 02 13:01:22 crc kubenswrapper[4658]: I1002 13:01:22.293657 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f6gx2" podStartSLOduration=1.713533649 podStartE2EDuration="4.293636349s" podCreationTimestamp="2025-10-02 13:01:18 +0000 UTC" firstStartedPulling="2025-10-02 13:01:19.241738905 +0000 UTC m=+6160.132892472" lastFinishedPulling="2025-10-02 13:01:21.821841605 +0000 UTC m=+6162.712995172" observedRunningTime="2025-10-02 13:01:22.289519468 +0000 UTC m=+6163.180673035" watchObservedRunningTime="2025-10-02 13:01:22.293636349 +0000 UTC m=+6163.184789916" Oct 02 13:01:28 crc kubenswrapper[4658]: I1002 13:01:28.355171 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:28 crc kubenswrapper[4658]: I1002 13:01:28.355785 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:28 crc kubenswrapper[4658]: I1002 13:01:28.415457 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:29 crc kubenswrapper[4658]: I1002 13:01:29.381226 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:29 crc kubenswrapper[4658]: I1002 13:01:29.442440 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.346476 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f6gx2" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="registry-server" containerID="cri-o://b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121" gracePeriod=2 Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.890926 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.919836 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgx9b\" (UniqueName: \"kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b\") pod \"a543aaef-542e-4081-866c-c1393e20ea9e\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.919968 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities\") pod \"a543aaef-542e-4081-866c-c1393e20ea9e\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.920092 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content\") pod \"a543aaef-542e-4081-866c-c1393e20ea9e\" (UID: \"a543aaef-542e-4081-866c-c1393e20ea9e\") " Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.920838 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities" (OuterVolumeSpecName: "utilities") pod "a543aaef-542e-4081-866c-c1393e20ea9e" (UID: "a543aaef-542e-4081-866c-c1393e20ea9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:01:31 crc kubenswrapper[4658]: I1002 13:01:31.927672 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b" (OuterVolumeSpecName: "kube-api-access-wgx9b") pod "a543aaef-542e-4081-866c-c1393e20ea9e" (UID: "a543aaef-542e-4081-866c-c1393e20ea9e"). InnerVolumeSpecName "kube-api-access-wgx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.001791 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cn57q_2e28d2d3-12b8-490d-a3f6-6e88c19e4cdf/cert-manager-controller/0.log" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.021524 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgx9b\" (UniqueName: \"kubernetes.io/projected/a543aaef-542e-4081-866c-c1393e20ea9e-kube-api-access-wgx9b\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.021550 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.032678 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a543aaef-542e-4081-866c-c1393e20ea9e" (UID: "a543aaef-542e-4081-866c-c1393e20ea9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.123868 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a543aaef-542e-4081-866c-c1393e20ea9e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.159967 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4jlqn_329487df-e7b0-4925-8c85-155c96453929/cert-manager-cainjector/0.log" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.235270 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-88pc4_648c22f9-bc82-4a6a-9b68-b9b557f0c243/cert-manager-webhook/0.log" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.360398 4658 generic.go:334] "Generic (PLEG): container finished" podID="a543aaef-542e-4081-866c-c1393e20ea9e" containerID="b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121" exitCode=0 Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.360437 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerDied","Data":"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121"} Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.360472 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6gx2" event={"ID":"a543aaef-542e-4081-866c-c1393e20ea9e","Type":"ContainerDied","Data":"3349bfcd229f16bb5aa406065d09f201436bff3e470ef418b7b292290a1f79cd"} Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.360489 4658 scope.go:117] "RemoveContainer" containerID="b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.360606 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6gx2" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.400903 4658 scope.go:117] "RemoveContainer" containerID="bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.430030 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.432432 4658 scope.go:117] "RemoveContainer" containerID="e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.441279 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f6gx2"] Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.476476 4658 scope.go:117] "RemoveContainer" containerID="b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121" Oct 02 13:01:32 crc kubenswrapper[4658]: E1002 13:01:32.476906 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121\": container with ID starting with b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121 not found: ID does not exist" containerID="b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.476954 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121"} err="failed to get container status \"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121\": rpc error: code = NotFound desc = could not find container \"b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121\": container with ID starting with b3081d68e998738bfc5027445f66dfd72918d4edfd33d586f6082d3b8dfc0121 not found: ID does not exist" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.476981 4658 scope.go:117] "RemoveContainer" containerID="bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0" Oct 02 13:01:32 crc kubenswrapper[4658]: E1002 13:01:32.477369 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0\": container with ID starting with bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0 not found: ID does not exist" containerID="bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.477422 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0"} err="failed to get container status \"bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0\": rpc error: code = NotFound desc = could not find container \"bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0\": container with ID starting with bb41b45ddc5d1838793927548a4bcec1fb76e7c77ce32de4534eeb07c9ffe7a0 not found: ID does not exist" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.477452 4658 scope.go:117] "RemoveContainer" containerID="e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a" Oct 02 13:01:32 crc kubenswrapper[4658]: E1002 13:01:32.477788 4658 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a\": container with ID starting with e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a not found: ID does not exist" containerID="e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a" Oct 02 13:01:32 crc kubenswrapper[4658]: I1002 13:01:32.477815 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a"} err="failed to get container status \"e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a\": rpc error: code = NotFound desc = could not find container \"e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a\": container with ID starting with e4b8af72667f711ec46afceb15fc43190e75f26c9fa69b19f1b124d4aa9baf1a not found: ID does not exist" Oct 02 13:01:33 crc kubenswrapper[4658]: I1002 13:01:33.964335 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" path="/var/lib/kubelet/pods/a543aaef-542e-4081-866c-c1393e20ea9e/volumes" Oct 02 13:01:43 crc kubenswrapper[4658]: I1002 13:01:43.997695 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-nczcp_53ded798-0460-49d4-8c75-f21907458150/nmstate-console-plugin/0.log" Oct 02 13:01:44 crc kubenswrapper[4658]: I1002 13:01:44.172183 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rpw4d_3f8f0836-7d23-4df5-8658-79d424122ab3/nmstate-handler/0.log" Oct 02 13:01:44 crc kubenswrapper[4658]: I1002 13:01:44.216479 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-g8smq_485529a7-2da9-40c3-adff-56109c78dbc1/kube-rbac-proxy/0.log" Oct 02 13:01:44 crc kubenswrapper[4658]: I1002 13:01:44.240362 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-g8smq_485529a7-2da9-40c3-adff-56109c78dbc1/nmstate-metrics/0.log" Oct 02 13:01:44 crc kubenswrapper[4658]: I1002 13:01:44.352128 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-zvskn_08cea959-43c4-4ecc-b38d-2960b5d8180c/nmstate-operator/0.log" Oct 02 13:01:44 crc kubenswrapper[4658]: I1002 13:01:44.426877 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-hbfjg_bf546db5-7a99-4338-9c1e-0aecfdf1d7fb/nmstate-webhook/0.log" Oct 02 13:01:57 crc kubenswrapper[4658]: I1002 13:01:57.429855 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:01:57 crc kubenswrapper[4658]: I1002 13:01:57.430672 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:01:57 crc kubenswrapper[4658]: I1002 13:01:57.999033 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-bsjg9_9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41/kube-rbac-proxy/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.201781 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.222429 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bsjg9_9686fc5d-61b7-47a1-b0b0-0bcdd8b31d41/controller/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.386456 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.426586 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.431777 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.433171 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.608747 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.610821 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.616308 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.633219 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.763409 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-frr-files/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.819175 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/controller/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.819536 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-reloader/0.log" Oct 02 13:01:58 crc kubenswrapper[4658]: I1002 13:01:58.824018 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/cp-metrics/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.007709 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/kube-rbac-proxy-frr/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.042275 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/kube-rbac-proxy/0.log" Oct 02 13:01:59 
crc kubenswrapper[4658]: I1002 13:01:59.046735 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/frr-metrics/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.236962 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/reloader/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.341795 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-k4bcd_74c490f6-26be-4b3c-93f7-65b1625425a1/frr-k8s-webhook-server/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.474314 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c6495c478-cxldq_f9c60d31-755b-4e0e-888c-072203581d0d/manager/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.683508 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7db6cbc8bb-b4n8z_00a0ddd2-7f0b-4158-a95a-dd16a826ea1e/webhook-server/0.log" Oct 02 13:01:59 crc kubenswrapper[4658]: I1002 13:01:59.799639 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mrv9d_f053e253-c411-41a9-b81b-d7cf91cc9b8b/kube-rbac-proxy/0.log" Oct 02 13:02:00 crc kubenswrapper[4658]: I1002 13:02:00.400656 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mrv9d_f053e253-c411-41a9-b81b-d7cf91cc9b8b/speaker/0.log" Oct 02 13:02:00 crc kubenswrapper[4658]: I1002 13:02:00.682804 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jzr5d_0d17ce7e-0727-401c-b54e-8b6e6729d22a/frr/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.204073 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.353066 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.417913 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.435714 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.637567 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/util/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.649765 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/pull/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.653847 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2p2mlb_96b70650-2104-48e7-80fb-a2294a277006/extract/0.log" Oct 02 13:02:12 crc kubenswrapper[4658]: I1002 13:02:12.805636 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.034577 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.038878 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.041110 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.247748 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/extract/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.276706 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/pull/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.281623 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dtv7rf_d3132797-270c-4510-9f55-754ad5e47f34/util/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.583392 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.813441 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.824964 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log" Oct 02 13:02:13 crc kubenswrapper[4658]: I1002 13:02:13.825604 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.026179 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-utilities/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.062615 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/extract-content/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.228509 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.387753 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mm7ql_d41a29d7-3972-41e5-9ab4-fd44f44bc184/registry-server/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.438935 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.478842 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.502784 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.631893 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-utilities/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.651253 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/extract-content/0.log" Oct 02 13:02:14 crc kubenswrapper[4658]: I1002 13:02:14.920086 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.171386 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.198593 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.198992 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.403842 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7kxq7_8b9f70c5-a35e-43e4-9b22-41a924ab19f3/registry-server/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.413010 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/extract/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.418749 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/util/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.419466 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckxtrb_65876d37-f714-4df4-8631-442538f87981/pull/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.570271 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-clrx4_0cdd5f96-dd0d-4f77-8e41-83a8493dbca7/marketplace-operator/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.611455 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.813143 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.824440 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log" Oct 02 13:02:15 crc kubenswrapper[4658]: I1002 13:02:15.842908 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.006378 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-content/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.011898 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/extract-utilities/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.113023 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.194601 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7v4fx_1cab9c15-8dc5-46cf-bb34-84ea996f0cc6/registry-server/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.306144 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.313654 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.341933 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.465014 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-utilities/0.log" Oct 02 13:02:16 crc kubenswrapper[4658]: I1002 13:02:16.506867 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/extract-content/0.log" Oct 02 13:02:17 crc kubenswrapper[4658]: I1002 13:02:17.164559 4658 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-czdjc_4cf74ad0-2d22-4e96-a77f-0df6ee38dfde/registry-server/0.log" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.979941 4658 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:21 crc kubenswrapper[4658]: E1002 13:02:21.980864 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="extract-content" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.980880 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="extract-content" Oct 02 13:02:21 crc kubenswrapper[4658]: E1002 13:02:21.980922 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="extract-utilities" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.980930 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="extract-utilities" Oct 02 13:02:21 crc kubenswrapper[4658]: E1002 13:02:21.980960 4658 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="registry-server" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.980970 4658 state_mem.go:107] "Deleted CPUSet assignment" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="registry-server" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.981203 4658 memory_manager.go:354] "RemoveStaleState removing state" podUID="a543aaef-542e-4081-866c-c1393e20ea9e" containerName="registry-server" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.983028 4658 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:21 crc kubenswrapper[4658]: I1002 13:02:21.998098 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.079436 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfx99\" (UniqueName: \"kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.079598 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.080148 4658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.182634 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.182759 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfx99\" (UniqueName: \"kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.182806 4658 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.183315 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.183379 4658 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.219150 4658 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vfx99\" (UniqueName: \"kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99\") pod \"community-operators-qqzgh\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.312380 4658 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:22 crc kubenswrapper[4658]: I1002 13:02:22.969329 4658 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:23 crc kubenswrapper[4658]: I1002 13:02:23.904209 4658 generic.go:334] "Generic (PLEG): container finished" podID="4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" containerID="c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae" exitCode=0 Oct 02 13:02:23 crc kubenswrapper[4658]: I1002 13:02:23.904318 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerDied","Data":"c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae"} Oct 02 13:02:23 crc kubenswrapper[4658]: I1002 13:02:23.904522 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerStarted","Data":"c3087830f885bf229ed6ee5b0c6f79b47549314293c0845b71b1d6a2a37ab148"} Oct 02 13:02:25 crc kubenswrapper[4658]: I1002 13:02:25.942843 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerStarted","Data":"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1"} Oct 02 13:02:26 crc kubenswrapper[4658]: I1002 13:02:26.953990 4658 generic.go:334] "Generic (PLEG): container finished" podID="4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" containerID="83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1" exitCode=0 Oct 02 13:02:26 crc kubenswrapper[4658]: I1002 13:02:26.954044 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerDied","Data":"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1"} Oct 02 13:02:27 crc kubenswrapper[4658]: I1002 13:02:27.430130 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:02:27 crc kubenswrapper[4658]: I1002 13:02:27.430214 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:02:27 crc kubenswrapper[4658]: I1002 13:02:27.969077 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerStarted","Data":"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76"} Oct 02 
13:02:27 crc kubenswrapper[4658]: I1002 13:02:27.987802 4658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qqzgh" podStartSLOduration=3.406222805 podStartE2EDuration="6.987784468s" podCreationTimestamp="2025-10-02 13:02:21 +0000 UTC" firstStartedPulling="2025-10-02 13:02:23.906967265 +0000 UTC m=+6224.798120842" lastFinishedPulling="2025-10-02 13:02:27.488528898 +0000 UTC m=+6228.379682505" observedRunningTime="2025-10-02 13:02:27.987385485 +0000 UTC m=+6228.878539062" watchObservedRunningTime="2025-10-02 13:02:27.987784468 +0000 UTC m=+6228.878938035" Oct 02 13:02:29 crc kubenswrapper[4658]: I1002 13:02:29.410054 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-wxwrk_6abb4e77-380e-45f9-94dd-0511e0194885/prometheus-operator/0.log" Oct 02 13:02:29 crc kubenswrapper[4658]: I1002 13:02:29.591068 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7598d7fff9-42nhm_e8c24809-b49a-4a7d-9fd8-58f83c33a290/prometheus-operator-admission-webhook/0.log" Oct 02 13:02:29 crc kubenswrapper[4658]: I1002 13:02:29.646525 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7598d7fff9-kwt7g_8c427e03-4bb9-4dc4-a866-765e097e498f/prometheus-operator-admission-webhook/0.log" Oct 02 13:02:29 crc kubenswrapper[4658]: I1002 13:02:29.842392 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-b7497_6ae51e31-b742-4b5c-870a-d7bfc95151f1/operator/0.log" Oct 02 13:02:29 crc kubenswrapper[4658]: I1002 13:02:29.939757 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-4qk6g_79a71fa2-31f7-4ce5-9043-cdfad20543ec/perses-operator/0.log" Oct 02 13:02:32 crc kubenswrapper[4658]: I1002 13:02:32.313119 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:32 crc kubenswrapper[4658]: I1002 13:02:32.313770 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:32 crc kubenswrapper[4658]: I1002 13:02:32.401164 4658 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:33 crc kubenswrapper[4658]: I1002 13:02:33.053674 4658 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:33 crc kubenswrapper[4658]: I1002 13:02:33.093668 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.042150 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qqzgh" podUID="4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" containerName="registry-server" containerID="cri-o://4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76" gracePeriod=2 Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.602059 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.774311 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content\") pod \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.774368 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfx99\" (UniqueName: \"kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99\") pod \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.774468 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities\") pod \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\" (UID: \"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5\") " Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.775883 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities" (OuterVolumeSpecName: "utilities") pod "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" (UID: "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.786454 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99" (OuterVolumeSpecName: "kube-api-access-vfx99") pod "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" (UID: "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5"). InnerVolumeSpecName "kube-api-access-vfx99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.826105 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" (UID: "4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.877540 4658 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.877584 4658 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:35 crc kubenswrapper[4658]: I1002 13:02:35.877600 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfx99\" (UniqueName: \"kubernetes.io/projected/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5-kube-api-access-vfx99\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.052444 4658 generic.go:334] "Generic (PLEG): container finished" podID="4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" containerID="4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76" exitCode=0 Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.052501 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerDied","Data":"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76"} Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.052526 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqzgh" event={"ID":"4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5","Type":"ContainerDied","Data":"c3087830f885bf229ed6ee5b0c6f79b47549314293c0845b71b1d6a2a37ab148"} Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.052542 4658 scope.go:117] "RemoveContainer" containerID="4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.052547 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqzgh" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.075072 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.083448 4658 scope.go:117] "RemoveContainer" containerID="83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.087359 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qqzgh"] Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.105502 4658 scope.go:117] "RemoveContainer" containerID="c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.158852 4658 scope.go:117] "RemoveContainer" containerID="4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76" Oct 02 13:02:36 crc kubenswrapper[4658]: E1002 13:02:36.159997 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76\": container with ID starting with 4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76 not found: ID does not exist" containerID="4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.160027 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76"} err="failed to get container status \"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76\": rpc error: code = NotFound desc = could not find container \"4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76\": container with ID starting with 4202826ff12ecf0431cf91292856192530e647191b6c1b33cb7881634b157a76 not found: ID does not exist" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.160047 4658 scope.go:117] "RemoveContainer" containerID="83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1" Oct 02 13:02:36 crc kubenswrapper[4658]: E1002 13:02:36.160484 4658 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1\": container with ID starting with 83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1 not found: ID does not exist" containerID="83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.160507 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1"} err="failed to get container status \"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1\": rpc error: code = NotFound desc = could not find container \"83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1\": container with ID starting with 83dff5f34c98bb71ab7f633ca4dc3d66fea5ef9b7dfd295768e40ecd696339f1 not found: ID does not exist" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.160521 4658 scope.go:117] "RemoveContainer" containerID="c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae" Oct 02 13:02:36 crc kubenswrapper[4658]: E1002 13:02:36.166414 4658 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae\": container with ID starting with c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae not found: ID does not exist" containerID="c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae" Oct 02 13:02:36 crc kubenswrapper[4658]: I1002 13:02:36.166438 4658 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae"} err="failed to get container status \"c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae\": rpc error: code = NotFound desc = could not find container \"c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae\": container with ID starting with c2b4bff04c50e163cb1d96b66cd3c264171ec8d293392f4c8d33dd7c085fe7ae not found: ID does not exist" Oct 02 13:02:37 crc kubenswrapper[4658]: I1002 13:02:37.981462 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5" path="/var/lib/kubelet/pods/4f275575-f0c5-4ea4-a3a8-6bc3959e2ec5/volumes" Oct 02 13:02:47 crc kubenswrapper[4658]: E1002 13:02:47.445668 4658 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:56552->38.102.83.32:41677: write tcp 38.102.83.32:56552->38.102.83.32:41677: write: broken pipe Oct 02 13:02:57 crc kubenswrapper[4658]: I1002 13:02:57.430393 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:02:57 crc kubenswrapper[4658]: I1002 13:02:57.430983 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:02:57 crc kubenswrapper[4658]: I1002 13:02:57.431037 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 13:02:57 crc kubenswrapper[4658]: I1002 13:02:57.431978 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:02:57 crc kubenswrapper[4658]: I1002 13:02:57.432088 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7" gracePeriod=600 Oct 02 13:02:58 crc kubenswrapper[4658]: I1002 13:02:58.342820 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7" exitCode=0 Oct 02 13:02:58 crc kubenswrapper[4658]: I1002 13:02:58.342908 4658 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7"} Oct 02 13:02:58 crc kubenswrapper[4658]: I1002 13:02:58.343157 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerStarted","Data":"a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0"} Oct 02 13:02:58 crc kubenswrapper[4658]: I1002 13:02:58.343175 4658 scope.go:117] "RemoveContainer" containerID="7661ba3bcc35fb1c1067dd96226b41215cbd47cae3d5f8fad7a8b92aab624600" Oct 02 13:04:45 crc kubenswrapper[4658]: I1002 13:04:45.488051 4658 generic.go:334] "Generic (PLEG): container finished" podID="75a3fbaf-6798-43b3-a912-fb1afa675811" containerID="47a4ad3fed7b9364d72a9b95cfe0e1328567085514ae69406aa758e646c1dd3c" exitCode=0 Oct 02 13:04:45 crc kubenswrapper[4658]: I1002 13:04:45.488167 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" event={"ID":"75a3fbaf-6798-43b3-a912-fb1afa675811","Type":"ContainerDied","Data":"47a4ad3fed7b9364d72a9b95cfe0e1328567085514ae69406aa758e646c1dd3c"} Oct 02 13:04:45 crc kubenswrapper[4658]: I1002 13:04:45.489448 4658 scope.go:117] "RemoveContainer" containerID="47a4ad3fed7b9364d72a9b95cfe0e1328567085514ae69406aa758e646c1dd3c" Oct 02 13:04:45 crc kubenswrapper[4658]: I1002 13:04:45.923429 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5fsxr_must-gather-n4cj7_75a3fbaf-6798-43b3-a912-fb1afa675811/gather/0.log" Oct 02 13:04:57 crc kubenswrapper[4658]: I1002 13:04:57.430000 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:04:57 crc kubenswrapper[4658]: I1002 13:04:57.430629 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.439502 4658 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5fsxr/must-gather-n4cj7"] Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.440406 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" podUID="75a3fbaf-6798-43b3-a912-fb1afa675811" containerName="copy" containerID="cri-o://66f606f3febc06db27b903e58569c21c86dd0528ba37027567cbe73a9ee18bee" gracePeriod=2 Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.450073 4658 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5fsxr/must-gather-n4cj7"] Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.629663 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5fsxr_must-gather-n4cj7_75a3fbaf-6798-43b3-a912-fb1afa675811/copy/0.log" Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.630575 4658 generic.go:334] "Generic (PLEG): container finished" 
podID="75a3fbaf-6798-43b3-a912-fb1afa675811" containerID="66f606f3febc06db27b903e58569c21c86dd0528ba37027567cbe73a9ee18bee" exitCode=143 Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.853757 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5fsxr_must-gather-n4cj7_75a3fbaf-6798-43b3-a912-fb1afa675811/copy/0.log" Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.854080 4658 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.959720 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlcnb\" (UniqueName: \"kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb\") pod \"75a3fbaf-6798-43b3-a912-fb1afa675811\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.959823 4658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output\") pod \"75a3fbaf-6798-43b3-a912-fb1afa675811\" (UID: \"75a3fbaf-6798-43b3-a912-fb1afa675811\") " Oct 02 13:04:59 crc kubenswrapper[4658]: I1002 13:04:59.965065 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb" (OuterVolumeSpecName: "kube-api-access-wlcnb") pod "75a3fbaf-6798-43b3-a912-fb1afa675811" (UID: "75a3fbaf-6798-43b3-a912-fb1afa675811"). InnerVolumeSpecName "kube-api-access-wlcnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.061435 4658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlcnb\" (UniqueName: \"kubernetes.io/projected/75a3fbaf-6798-43b3-a912-fb1afa675811-kube-api-access-wlcnb\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.164605 4658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "75a3fbaf-6798-43b3-a912-fb1afa675811" (UID: "75a3fbaf-6798-43b3-a912-fb1afa675811"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.166502 4658 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a3fbaf-6798-43b3-a912-fb1afa675811-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.651365 4658 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5fsxr_must-gather-n4cj7_75a3fbaf-6798-43b3-a912-fb1afa675811/copy/0.log" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.652044 4658 scope.go:117] "RemoveContainer" containerID="66f606f3febc06db27b903e58569c21c86dd0528ba37027567cbe73a9ee18bee" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.652210 4658 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5fsxr/must-gather-n4cj7" Oct 02 13:05:00 crc kubenswrapper[4658]: I1002 13:05:00.681749 4658 scope.go:117] "RemoveContainer" containerID="47a4ad3fed7b9364d72a9b95cfe0e1328567085514ae69406aa758e646c1dd3c" Oct 02 13:05:01 crc kubenswrapper[4658]: I1002 13:05:01.969417 4658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a3fbaf-6798-43b3-a912-fb1afa675811" path="/var/lib/kubelet/pods/75a3fbaf-6798-43b3-a912-fb1afa675811/volumes" Oct 02 13:05:26 crc kubenswrapper[4658]: I1002 13:05:26.260702 4658 scope.go:117] "RemoveContainer" containerID="906e94a8ae08eabf4cebdbb254079d0d22d6ef6beec0f6f90580006348d3ee4b" Oct 02 13:05:27 crc kubenswrapper[4658]: I1002 13:05:27.430276 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:05:27 crc kubenswrapper[4658]: I1002 13:05:27.430738 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:05:57 crc kubenswrapper[4658]: I1002 13:05:57.430213 4658 patch_prober.go:28] interesting pod/machine-config-daemon-pnjp5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:05:57 crc kubenswrapper[4658]: I1002 13:05:57.430875 4658 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:05:57 crc kubenswrapper[4658]: I1002 13:05:57.430936 4658 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" Oct 02 13:05:57 crc kubenswrapper[4658]: I1002 13:05:57.431906 4658 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0"} pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:05:57 crc kubenswrapper[4658]: I1002 13:05:57.431992 4658 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerName="machine-config-daemon" containerID="cri-o://a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" gracePeriod=600 Oct 02 13:05:57 crc kubenswrapper[4658]: E1002 13:05:57.555097 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 13:05:58 crc kubenswrapper[4658]: I1002 13:05:58.301747 4658 generic.go:334] "Generic (PLEG): container finished" podID="53173b86-be4f-4b39-8f70-f7282ab529fb" containerID="a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" exitCode=0 Oct 02 13:05:58 crc kubenswrapper[4658]: I1002 13:05:58.301811 4658 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" event={"ID":"53173b86-be4f-4b39-8f70-f7282ab529fb","Type":"ContainerDied","Data":"a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0"} Oct 02 13:05:58 crc kubenswrapper[4658]: I1002 13:05:58.302093 4658 scope.go:117] "RemoveContainer" containerID="af02ac7a68f71b372c931df2b7dcf958edacd7a1fa88322aa09952c29cef20b7" Oct 02 13:05:58 crc kubenswrapper[4658]: I1002 13:05:58.303273 4658 scope.go:117] "RemoveContainer" containerID="a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" Oct 02 13:05:58 crc kubenswrapper[4658]: E1002 13:05:58.303906 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 13:06:10 crc kubenswrapper[4658]: I1002 13:06:10.949906 4658 scope.go:117] "RemoveContainer" containerID="a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" Oct 02 13:06:10 crc kubenswrapper[4658]: E1002 13:06:10.950863 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 13:06:21 crc kubenswrapper[4658]: I1002 13:06:21.949548 4658 scope.go:117] "RemoveContainer" containerID="a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" Oct 02 13:06:21 crc kubenswrapper[4658]: E1002 13:06:21.950659 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb" Oct 02 13:06:35 crc kubenswrapper[4658]: I1002 13:06:35.950408 4658 scope.go:117] "RemoveContainer" containerID="a775be452f96b92b3e8b9528d3cce914a65666dbbc75cf49dd05267b27e1cbb0" Oct 02 13:06:35 crc kubenswrapper[4658]: E1002 13:06:35.951653 4658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pnjp5_openshift-machine-config-operator(53173b86-be4f-4b39-8f70-f7282ab529fb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pnjp5" podUID="53173b86-be4f-4b39-8f70-f7282ab529fb"